Yet Another Automatic Differentiation
YAAD.arg — Function

    arg(node, i) -> ArgumentType

Returns the i-th argument of the call in node.
YAAD.args — Function

    args(node) -> Tuple

Returns the arguments of the call in node.
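A quick sketch of how these accessors fit together; register and Variable are documented further down this page, and the exact return values shown in the comments are assumptions:

    using YAAD

    x, y = Variable(rand(2)), Variable(rand(2))
    node = register(+, x, y)   # a CachedNode for the call +(x, y); see register below
    args(node)                 # (x, y)
    arg(node, 2)               # y
    operator(node)             # the operator wrapping `+`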
YAAD.backward — Function

    backward(node) -> nothing

Backward evaluation of the computation graph.
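A minimal end-to-end sketch, assuming YAAD ships gradient rules for matrix multiplication and tr; the var.grad field comes from the Variable entry below:

    using YAAD, LinearAlgebra

    x1, x2 = Variable(rand(3, 3)), Variable(rand(3, 3))
    y = tr(x1 * x2)    # builds CachedNodes while computing the value
    backward(y)        # propagate gradients back through the graph
    x1.grad            # accumulated gradient; here transpose(value(x2))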
YAAD.forward — Function

    forward(node) -> output

Forward evaluation of the computation graph. This method will call the operator in the computation graph and update the cache.

    forward(f, args...) -> output

For function calls.
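A short sketch of both forward forms above (register is documented below as the CachedNode constructor alias):

    forward(+, 1, 2)         # the plain-call form: simply evaluates +(1, 2) == 3

    x = Variable(rand(3))
    node = register(sum, x)  # a CachedNode for the call sum(x)
    forward(node)            # calls the operator and refreshes the node's cache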
YAAD.gradient — Function

    gradient(node, grad)

Returns the gradients of the call in node with respect to its arguments, given the upstream gradient grad.
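A sketch of querying a node for its argument gradients; that gradient returns one gradient per argument, as a tuple, is an assumption here:

    x = Variable(rand(3))
    node = sin.(x)           # a broadcasted CachedNode (see ComputGraphStyle below)
    forward(node)
    gradient(node, ones(3))  # assumed: a tuple holding the gradient w.r.t. x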
YAAD.operator — Function

    operator(node) -> YAAD.Operator

Returns the operator called in this node.
YAAD.register — Method

    register(f, args...; kwargs...)

This is just an alias for constructing a CachedNode. Note, however, that this function is what registers a node on the tape in the global-tape version of the implementation:

https://github.com/Roger-luo/YAAD.jl/tree/tape
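Given this, extending YAAD with a new primitive is typically a one-line register call plus a gradient rule. The gradient signature below is an assumption, not taken from this page:

    # assumed pattern: dispatch on the plain function; return a tuple of
    # gradients, one per argument
    Base.tanh(x::YAAD.AbstractNode) = YAAD.register(Base.tanh, x)
    YAAD.gradient(::typeof(tanh), grad, output, x) = (grad * (1 - output^2),)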
YAAD.value — Function

    value(node)

Returns the value when forwarding at the current node. value is different from the forward method: value only returns what the node already contains, and it will throw an error if the node does not contain anything.
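A sketch contrasting value with forward:

    x = Variable(rand(3))
    value(x)                 # a Variable just returns the value it stores
    node = register(sum, x)
    forward(node)            # populate the cache first
    value(node)              # the cached output; per the docstring, value
                             # errors on a node that contains nothing yet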
YAAD.AbstractNode — Type

    AbstractNode

Abstract type for nodes in the computation graph.
YAAD.CachedNode — Type

    CachedNode{NT, OutT} <: AbstractNode

Stores the cached output of type OutT from a node of type NT in the computation graph. CachedNode is mutable; its output can be updated by forward.
YAAD.LeafNode — Type

    LeafNode <: AbstractNode

Abstract type for leaf nodes in a computation graph.
YAAD.Node — Type

    Node{FT, ArgsT} <: AbstractNode

A general node in the computation graph. It stores a callable operator f of type FT and its arguments args of type ArgsT, which should be a tuple.
YAAD.Variable — Type

    Variable{T} <: LeafNode

A kind of leaf node: a general type for variables in the computation graph. Similar to PyTorch's Variable, gradients will be accumulated into var.grad.
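A sketch of the PyTorch-like accumulation; that backward accepts a seed gradient for non-scalar outputs is an assumption:

    x = Variable(rand(3))
    y = sin.(x)
    backward(y, ones(3))   # assumed seed-gradient form for non-scalar outputs
    g = copy(x.grad)
    backward(y, ones(3))
    x.grad ≈ 2g            # true: gradients accumulate into var.grad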
YAAD.PrintTrait — Function

    PrintTrait(node) -> Trait

YAAD.backward_type_assert — Function

    backward_type_assert(node, grad)

Throws a more readable error message for the backward type check.
YAAD.kwargs — Function

    kwargs(node) -> NamedTuple

Returns the keyword arguments of the call in node.
YAAD.uncat — Method

    uncat(dims, cat_output, xs...) -> Vector{SubArray}

The reverse operation of Base.cat: it returns the corresponding Base.view into cat_output for each input of the cat.
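For example, a sketch using the signature above:

    xs = (rand(2, 3), rand(2, 5))
    out = cat(xs...; dims=2)     # a 2×8 array
    vs = uncat(2, out, xs...)    # views into `out`, one per input
    size.(vs)                    # [(2, 3), (2, 5)]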
YAAD.ComputGraphStyle — Type

    ComputGraphStyle <: Broadcast.BroadcastStyle

With this broadcast style, a broadcast expression is forwarded and registered in the computation graph rather than evaluated directly.
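Concretely, a broadcast over a node is captured instead of computed; a small sketch:

    x = Variable(rand(3))
    y = sin.(x)          # ComputGraphStyle intercepts the broadcast...
    y isa CachedNode     # ...so y is a graph node, not a plain Array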
YAAD.Operator — Type

    Operator

Abstract type for operators in the computation graph.
Batched Operations
YAAD.Batched — Module

Batched operations in Julia.

This module wraps some useful batched operations, implemented with plain for-loops on the CPU. All the functions in this module have gradients defined in YAAD.
YAAD.Batched.ScalarIdentity — Type

    ScalarIdentity{B, K, T} <: AbstractArray{T, 3}

A batch of scalars, each multiplying an identity matrix, where the batch size is B and each identity has size K.
YAAD.Batched.Transpose — Type

    Transpose{B, T, AT <: AbstractArray{T, 3}} <: AbstractArray{T, 3}

Batched transpose: transposes a batch of matrices.
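As a sketch of the semantics, this is the plain CPU loop a batched transpose stands for; that the batch dimension is the last one is an assumption:

    A  = rand(4, 3, 8)            # a batch of 8 matrices, each 4×3
    At = similar(A, 3, 4, 8)
    for b in 1:8                  # plain for-loop over the batch, as the module does
        At[:, :, b] .= transpose(@view A[:, :, b])
    end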
Operator Traits
YAAD.Trait — Module

    Trait

This module contains function traits as subtypes of Operator.
YAAD.Trait.Broadcasted — Type

    Broadcasted{FT} <: Operator

This trait wraps a callable object that is being broadcast. It helps dispatch to the gradient methods overloaded for broadcast operations, as opposed to Method.
YAAD.Trait.Method — Type

    Method{FT} <: Operator

This trait wraps a callable object in Julia (usually a Function).
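These two traits exist so that a plain call and a broadcast of the same function can hit different gradient rules. A sketch of such per-trait overloads; the argument order is an assumption:

    # plain scalar call exp(x): one gradient per argument, as a tuple
    YAAD.gradient(::YAAD.Trait.Method{typeof(exp)}, grad, output, x) = (grad * output,)
    # broadcast call exp.(x): the element-wise rule
    YAAD.gradient(::YAAD.Trait.Broadcasted{typeof(exp)}, grad, output, x) = (grad .* output,)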