Yet Another Automatic Differentiation
YAAD.arg — Function

    arg(node, i) -> ArgumentType

Returns the i-th argument of the call in node.

YAAD.args — Function

    args(node) -> Tuple

Returns the arguments of the call in node.

YAAD.backward — Function

    backward(node) -> nothing

Backward evaluation of the computation graph.

YAAD.forward — Function

    forward(node) -> output

Forward evaluation of the computation graph. This method calls the operator in the computation graph and updates the cache.

    forward(f, args...) -> output

For function calls.

YAAD.gradient — Function

    gradient(node, grad)

Returns the gradient.

YAAD.operator — Function

    operator(node) -> YAAD.Operator

Returns the operator called in this node.

YAAD.register — Method

    register(f, args...; kwargs...)

This is just an alias for constructing a CachedNode. Note, however, that this function is used to register a node on the tape in the global-tape implementation:

https://github.com/Roger-luo/YAAD.jl/tree/tape

YAAD.register_parameters — Function

    register_parameters(x::OperatorType) -> iterator

Returns an iterator over all parameters in the instance x of OperatorType. Note that OperatorType does not need to be a subtype of Operator.

YAAD.value — Function

    value(node)

Returns the value of the current node from forwarding. value is different from forward: value only returns what the node already contains, and throws an error if the node does not contain anything.

YAAD.zero_grad! — Function

    zero_grad!(var)

Clears the gradient storage in the whole computation graph.
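Taken together, the functions above form the usual differentiation loop: wrap values in Variables, build a graph, forward it, then backward it. A minimal, unverified sketch using only the names documented in this section (in practice YAAD usually builds the graph for you via operator overloading, so the explicit register call is illustrative):

```julia
using YAAD, LinearAlgebra

x = Variable(rand(3, 3))   # leaf node: holds a value and accumulates x.grad
y = register(tr, x)        # CachedNode recording the call tr(x)
forward(y)                 # evaluate the graph and fill the cache
backward(y)                # propagate gradients back; accumulated into x.grad
zero_grad!(y)              # clear gradient storage before the next pass
```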
YAAD.AbstractArrayVariable — Type

    AbstractArrayVariable{T, N}

Alias for AbstractVariable; abstract type for variables containing an array.

YAAD.AbstractMatrixVariable — Type

    AbstractMatrixVariable{T}

Abstract type for variables containing a matrix. See AbstractVariable for more.

YAAD.AbstractNode — Type

    AbstractNode

Abstract type for nodes in the computation graph.

YAAD.AbstractVariable — Type

    AbstractVariable{T} <: Value{T}

Abstract type for variables. Variables are types that contain a value and gradients.

YAAD.AbstractVectorVariable — Type

    AbstractVectorVariable{T}

Abstract type for variables containing a vector. See AbstractVariable for more.

YAAD.CachedNode — Type

    CachedNode{NT, OutT} <: AbstractNode

Stores the cached output of type OutT from a node of type NT in the computation graph. CachedNode is mutable; its output can be updated by forward.

YAAD.Node — Type

    Node{FT, ArgsT} <: AbstractNode

General node in a computation graph. It stores a callable operator f of type FT and its arguments args of type ArgsT, which should be a tuple.

YAAD.Value — Type

    Value{T} <: AbstractNode

Abstract type for nodes that contain a value in a computation graph.

YAAD.Variable — Type

    Variable{T} <: Value{T}

A kind of leaf node. A general type for variables in a computation graph. Similar to PyTorch's Variable, the gradient will be accumulated to var.grad.
YAAD.PrintTrait — Function

    PrintTrait(node) -> Trait

YAAD.backward_type_assert — Function

    backward_type_assert(node, grad)

Throws a more readable error message for the backward type check.

YAAD.forward! — Function

    forward!(output, ...) -> output

YAAD.kwargs — Function

    kwargs(node) -> NamedTuple

Returns the keyword arguments of the call in node.
YAAD.uncat — Method

    uncat(dims, cat_output, xs...) -> Vector{SubArray}

The reverse operation of Base.cat: returns the corresponding Base.view into cat_output for each of the inputs of a cat.
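To make the uncat contract concrete, here is a Base-only illustration (no YAAD required) of the relationship it reverses: each input of a cat corresponds to a contiguous view of the concatenated output.

```julia
a = reshape(1:6, 2, 3)
b = reshape(7:12, 2, 3)
c = cat(a, b; dims=1)      # 4×3 array: rows 1:2 from a, rows 3:4 from b

va = view(c, 1:2, :)       # the slice of c that came from a
vb = view(c, 3:4, :)       # the slice of c that came from b
@assert va == a && vb == b # the views recover the original inputs
```

uncat(1, c, a, b) would return exactly such a Vector of SubArrays, one view per input.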
YAAD.ComputGraphStyle — Type

    ComputGraphStyle <: Broadcast.BroadcastStyle

Under this broadcast style, a broadcast expression is registered in the computation graph rather than calculated directly.
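In other words, broadcasting over a Variable returns a node rather than an array. A hedged sketch, assuming only the Variable/forward API documented in this section:

```julia
using YAAD

x = Variable(rand(4))
y = sin.(x)    # not computed eagerly: the broadcast is recorded as a node
forward(y)     # the actual evaluation happens when the graph is forwarded
```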
YAAD.Operator — Type

    Operator

Abstract type for operators in the computation graph.
Batched Operations

    Modules = [YAAD.Batched]

Operator Traits
YAAD.Trait — Module

    Trait

This module contains function traits as subtypes of Operator.

YAAD.Trait.Broadcasted — Type

    Broadcasted{FT} <: Operator

This trait wraps a callable object that is being broadcast. It helps dispatch to the gradient methods overloaded for broadcast operations, as opposed to Method.

YAAD.Trait.Method — Type

    Method{FT} <: Operator

This trait wraps a callable object in Julia (usually a Function).