Yet Another Automatic Differentiation
YAAD.arg
— Function.arg(node, i) -> ArgumentType
Returns the i-th argument of the call in node.
YAAD.args
— Function.args(node) -> Tuple
Returns the arguments of the call in node.
YAAD.backward
— Function.backward(node) -> nothing
Backward evaluation of the computation graph.
YAAD.forward
— Function.forward(node) -> output
Forward evaluation of the computation graph. This method calls the operator in the computation graph and updates the cache.
forward(f, args...) -> output
For function calls.
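The call-and-cache behaviour described above can be sketched in a few lines of plain Julia. This is an illustrative mock-up (the `Toy` names are not YAAD's internals): a node stores an operator and its arguments, and forwarding applies the operator recursively, writing the result into the cache.

```julia
# A minimal sketch (not YAAD's actual implementation) of forward
# evaluation of a call node: the node stores an operator and its
# arguments, and forwarding applies the operator, caching the output.
mutable struct ToyCachedNode
    f                 # the callable operator
    args::Tuple       # arguments of the call
    output            # cache, filled in by toy_forward
end

toy_forward(x) = x    # plain values forward to themselves
function toy_forward(node::ToyCachedNode)
    node.output = node.f(map(toy_forward, node.args)...)
    return node.output
end

node = ToyCachedNode(+, (1.0, 2.0), nothing)
toy_forward(node)     # evaluates the call and updates node.output to 3.0
```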
YAAD.gradient
— Function.gradient(node, grad)
Returns the gradient.
YAAD.operator
— Function.operator(node) -> YAAD.Operator
Returns the operator called in this node.
YAAD.register
— Method.register(f, args...; kwargs...)
This is just an alias for constructing a CachedNode. Note, however, that this function is used to register a node on the tape in the global-tape implementation:
https://github.com/Roger-luo/YAAD.jl/tree/tape
YAAD.value
— Function.value(node)
Returns the value produced by forwarding at the current node. value differs from the forward method: value only returns what the node already contains, and it throws an error if the node does not contain anything.
YAAD.AbstractNode
— Type.AbstractNode
Abstract type for nodes in computation graph.
YAAD.CachedNode
— Type.CachedNode{NT, OutT} <: AbstractNode
Stores the cached output of type OutT from a node of type NT in the computation graph. CachedNode is mutable; its output can be updated by forward.
YAAD.LeafNode
— Type.LeafNode <: AbstractNode
Abstract type for leaf nodes in a computation graph.
YAAD.Node
— Type.Node{FT, ArgsT} <: AbstractNode
General node in a computation graph. It stores a callable operator f of type FT and its arguments args of type ArgsT, which should be a tuple.
YAAD.Variable
— Type.Variable{T} <: LeafNode
A kind of leaf node. A general type for variables in a computation graph. Similar to PyTorch's Variable; the gradient will be accumulated to var.grad.
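The accumulation semantics mentioned above can be sketched in plain Julia. The names here are illustrative, not YAAD's internals: a leaf variable carries a value and a grad field, and each backward pass adds into grad rather than overwriting it, so gradients from multiple paths through the graph are summed.

```julia
# A minimal sketch of gradient accumulation on a leaf variable
# (illustrative names, not YAAD's internals).
mutable struct ToyVariable{T}
    value::T
    grad::T
end
ToyVariable(value) = ToyVariable(value, zero(value))

# add an incoming gradient into the variable instead of overwriting it
accumulate_grad!(v::ToyVariable, g) = (v.grad += g; v)

x = ToyVariable(2.0)
accumulate_grad!(x, 1.0)   # gradient from one path
accumulate_grad!(x, 0.5)   # gradient from another path
x.grad                     # 1.5: contributions are summed
```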
YAAD.PrintTrait
— Function.PrintTrait(node) -> Trait
YAAD.backward_type_assert
— Function.backward_type_assert(node, grad)
Throws a more readable error message for the backward type check.
YAAD.kwargs
— Function.kwargs(node) -> NamedTuple
Returns the keyword arguments of the call in node.
YAAD.uncat
— Method.uncat(dims, cat_output, xs...) -> Vector{SubArray}
The reverse operation of Base.cat: it returns the corresponding Base.views of the inputs of a cat.
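Assuming the signature shown above, an uncat-like helper can be sketched in plain Julia: given the concatenation dimension, the cat output, and the original inputs, it returns one view into the output per input, each covering that input's slice along dims.

```julia
# A sketch of uncat-style behaviour (illustrative implementation, not
# YAAD's): slice the cat output back into views matching each input.
function toy_uncat(dims::Int, cat_output::AbstractArray, xs::AbstractArray...)
    views = SubArray[]
    start = 1
    for x in xs
        len = size(x, dims)
        # select start:start+len-1 along dims, everything along other dims
        idx = ntuple(d -> d == dims ? (start:start+len-1) : Colon(),
                     ndims(cat_output))
        push!(views, view(cat_output, idx...))
        start += len
    end
    return views
end

a, b = rand(2, 3), rand(4, 3)
c = cat(a, b; dims=1)
va, vb = toy_uncat(1, c, a, b)   # va == a and vb == b, as views into c
```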
YAAD.ComputGraphStyle
— Type.ComputGraphStyle <: Broadcast.BroadcastStyle
This broadcast style forwards the broadcast expression to be registered in a computation graph, rather than evaluating it directly.
YAAD.Operator
— Type.Operator
Abstract type for operators in the computation graph.
Batched Operations
YAAD.Batched
— Module.Batched operations in Julia.
This module wraps some useful batched operations as plain for-loops on the CPU. All functions in this module have gradients defined in YAAD.
YAAD.Batched.ScalarIdentity
— Type.ScalarIdentity{B, K, T} <: AbstractArray{T, 3}
A batch of scalars multiplying a batch of identity matrices, where the batch size is B and each identity is of size K.
YAAD.Batched.Transpose
— Type.Transpose{B, T, AT <: AbstractArray{T, 3}} <: AbstractArray{T, 3}
Batched transpose. Transposes a batch of matrices.
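The plain for-loop style this module describes can be sketched as follows, assuming the common rows × cols × batch layout for a batch of matrices stored as a rank-3 array (an illustrative implementation, not YAAD's):

```julia
# A plain for-loop sketch of batched transpose: transpose each matrix
# in a (m × n × B) array, producing an (n × m × B) array.
function toy_batched_transpose(A::AbstractArray{T,3}) where T
    m, n, B = size(A)
    out = similar(A, n, m, B)
    for b in 1:B, j in 1:n, i in 1:m
        out[j, i, b] = A[i, j, b]
    end
    return out
end

A = rand(2, 3, 4)
At = toy_batched_transpose(A)   # size (3, 2, 4); At[:, :, b] == A[:, :, b]'
```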
Operator Traits
YAAD.Trait
— Module.Trait
This module contains function traits as subtypes of Operator.
YAAD.Trait.Broadcasted
— Type.Broadcasted{FT} <: Operator
This trait wraps a callable object that is being broadcast. It helps dispatch to the gradient methods overloaded for broadcast operations, as distinct from Method.
YAAD.Trait.Method
— Type.Method{FT} <: Operator
This trait wraps a callable object in Julia (usually a Function).
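The point of these two trait wrappers is that the same callable can take different gradient rules depending on how it was called. A minimal sketch of that dispatch pattern in plain Julia (the `Toy` names are illustrative, not YAAD's):

```julia
# Trait wrappers let gradient rules dispatch on *how* a function was
# called (plain call vs. broadcast), not just on the function itself.
abstract type ToyOperator end
struct ToyMethod{FT} <: ToyOperator        # a plain function call
    f::FT
end
struct ToyBroadcasted{FT} <: ToyOperator   # the same callable, broadcast
    f::FT
end

# gradient rules dispatch on the trait wrapper
gradient_rule(op::ToyMethod) = "plain-call gradient for $(op.f)"
gradient_rule(op::ToyBroadcasted) = "broadcast gradient for $(op.f)"

gradient_rule(ToyMethod(sin))        # selects the plain-call rule
gradient_rule(ToyBroadcasted(sin))   # selects the broadcast rule
```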