Yet Another Automatic Differentiation

YAAD.arg (Function)
arg(node, i) -> ArgumentType

Returns the i-th argument of the call in node.

YAAD.args (Function)
args(node) -> Tuple

Returns the arguments of the call in node.
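
For example, arg, args, and operator can be used together to inspect a recorded call. A minimal sketch, assuming * is registered for nodes as in the package README:

    using YAAD

    x, y = Variable(rand(2, 2)), Variable(rand(2, 2))
    z = x * y       # a CachedNode recording the call *(x, y)

    args(z)         # the argument tuple (x, y)
    arg(z, 2)       # the second argument, y
    operator(z)     # the Operator wrapping *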

YAAD.backward (Function)
backward(node) -> nothing

Backward evaluation of the computation graph.
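
A minimal end-to-end sketch, mirroring the package README; it assumes * and tr are registered for nodes and that backward on a scalar-valued node seeds the gradient with one:

    using YAAD, LinearAlgebra

    x1, x2 = Variable(rand(30, 30)), Variable(rand(30, 30))
    y = tr(x1 * x2)   # forward pass: builds the graph and caches outputs
    backward(y)       # backward pass: propagates gradients through the graph
    x1.grad           # gradient of tr(x1 * x2) with respect to x1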

YAAD.forward (Function)
forward(node) -> output

Forward evaluation of the computation graph. This method calls the operator in the computation graph and updates the cache.

forward(f, args...) -> output

For plain function calls: evaluates f(args...).
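
A sketch of both methods, assuming the eager caching behavior described for CachedNode below (constructing a node runs the operator once and caches the result):

    using YAAD

    x, y = Variable(rand(3, 3)), Variable(rand(3, 3))
    z = x * y         # constructing the node evaluates x * y and caches it
    forward(z)        # re-runs the operator on current values, refreshing the cache

    forward(+, 1, 2)  # the function-call form simply evaluates +(1, 2)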

YAAD.gradient (Function)
gradient(node, grad)

Returns the gradient(s) with respect to the arguments of the call in node, given the gradient grad propagated to node.
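
Gradients for a new operator are supplied by overloading this function. A hedged sketch of the usual pattern; the exact overload signature and the square helper are assumptions for illustration, and the returned tuple holds one gradient per argument:

    using YAAD

    # hypothetical user-defined primitive
    square(x) = x .^ 2
    square(x::YAAD.AbstractNode) = YAAD.register(square, x)

    # gradient rule: d(x^2)/dx = 2x, chained with the incoming grad
    YAAD.gradient(::typeof(square), grad, output, x) = (2 .* grad .* x,)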

YAAD.operator (Function)
operator(node) -> YAAD.Operator

Returns the operator called in this node.

YAAD.register (Method)
register(f, args...; kwargs...)

This is just an alias for constructing a CachedNode. Note, however, that this function is also used to register the node on the tape in the global-tape implementation:

https://github.com/Roger-luo/YAAD.jl/tree/tape
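
A small sketch of direct use; since register eagerly evaluates the call, the returned node already carries its output:

    using YAAD

    x = Variable(rand(3))
    node = YAAD.register(sum, x)   # a CachedNode wrapping the call sum(x)
    value(node)                    # the cached output of sum(x)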

YAAD.value (Function)
value(node)

Returns the value from forwarding at the current node. value is different from the forward method: it only returns what the node already contains, and it throws an error if the node does not contain anything.
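
For instance (a sketch, assuming * is registered for nodes):

    using YAAD

    x = Variable(rand(2, 2))
    value(x)    # the array stored in the variable
    z = x * x
    value(z)    # the cached output, returned without re-running the operator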

YAAD.AbstractNode (Type)
AbstractNode

Abstract type for nodes in a computation graph.

YAAD.CachedNode (Type)
CachedNode{NT, OutT} <: AbstractNode

Stores the cache of the output, of type OutT, from a node of type NT in a computation graph. CachedNode is mutable; its output can be updated by forward.

YAAD.LeafNode (Type)
LeafNode <: AbstractNode

Abstract type for leaf nodes in a computation graph.

YAAD.Node (Type)
Node{FT, ArgsT} <: AbstractNode

A general node in a computation graph. It stores a callable operator f of type FT and its arguments args of type ArgsT, which should be a tuple.

YAAD.Variable (Type)
Variable{T} <: LeafNode

A kind of leaf node: a general type for variables in a computation graph. Similar to PyTorch's Variable, its gradient is accumulated into var.grad.
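
Because gradients accumulate, repeated backward passes add into var.grad. A sketch, assuming * and tr are registered as in the README example:

    using YAAD, LinearAlgebra

    x, y = Variable(rand(4, 4)), Variable(rand(4, 4))
    z = tr(x * y)

    backward(z)
    g = copy(x.grad)
    backward(z)        # a second pass accumulates rather than overwrites
    x.grad ≈ 2 .* g    # true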

YAAD.PrintTrait (Function)
PrintTrait(node) -> Trait

YAAD.backward_type_assert (Function)
backward_type_assert(node, grad)

Throws a more readable error message for the backward type check.

YAAD.kwargs (Function)
kwargs(node) -> NamedTuple

Returns the keyword arguments of the call in node.

YAAD.uncat (Method)
uncat(dims, cat_output, xs...) -> Vector{SubArray}

The reverse operation of Base.cat: it returns the corresponding Base.view of each input to a cat.
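
For example (a sketch; the exact return layout is an assumption based on the description above):

    using YAAD

    a, b = rand(2, 2), rand(2, 3)
    out = cat(a, b; dims=2)            # a 2×5 matrix
    vs = YAAD.uncat(2, out, a, b)      # one view into out per input
    vs[1] == a && vs[2] == b           # each view recovers the matching block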

YAAD.ComputGraphStyle (Type)
ComputGraphStyle <: Broadcast.BroadcastStyle

This style of broadcast forwards the broadcast expression to be registered in a computation graph, rather than evaluating it directly.
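
In effect, broadcasting over nodes yields another node whose value is cached, rather than a plain array. A sketch:

    using YAAD

    x = Variable(rand(3))
    z = sin.(x)              # the broadcast is recorded in the graph
    z isa YAAD.CachedNode    # true (an assumption about the concrete wrapper type)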

YAAD.Operator (Type)
Operator

Abstract type for operators in the computation graph.

Batched Operations

YAAD.Batched (Module)

Batched operations in Julia.

This module wraps some useful batched operations, implemented with plain for-loops on the CPU. All functions in this module have gradients defined in YAAD.

YAAD.Batched.ScalarIdentity (Type)
ScalarIdentity{B, K, T} <: AbstractArray{T, 3}

A batch of scalars multiplying a batch of identity matrices, where the batch size is B and each identity has size K.

YAAD.Batched.Transpose (Type)
Transpose{B, T, AT <: AbstractArray{T, 3}} <: AbstractArray{T, 3}

Batched transpose: transposes a batch of matrices.

Operator Traits

YAAD.Trait (Module)
Trait

This module contains function traits as a subtype of Operator.

YAAD.Trait.Broadcasted (Type)
Broadcasted{FT} <: Operator

This trait wraps a callable object that is being broadcast. It helps dispatch to gradient methods overloaded for broadcast operations, as opposed to plain calls wrapped in Method.

YAAD.Trait.Method (Type)
Method{FT} <: Operator

This trait wraps a callable object in Julia (usually a Function).
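
A sketch of how the two wrappers distinguish call styles, assuming the single-field wrapper structs described above:

    using YAAD

    m = YAAD.Trait.Method(sin)        # represents a plain call sin(x)
    b = YAAD.Trait.Broadcasted(sin)   # represents a broadcast call sin.(xs)

    # gradient methods can dispatch on the wrapper type to give
    # sin(x) and sin.(xs) different backward rules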