Yet Another Automatic Differentiation

YAAD.arg (Function).
arg(node, i) -> ArgumentType

Returns the i-th argument of the call in node.

YAAD.args (Function).
args(node) -> Tuple

Returns the arguments of the call in node.
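A minimal sketch of these accessors (assuming + on variables is registered as an operator, so x + x produces a CachedNode):

using YAAD

x = Variable(rand(2, 2))
y = x + x          # records the call +(x, x) as a CachedNode

args(y)            # (x, x): the arguments of the recorded call
arg(y, 1) === x    # true: the first argument of the call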

YAAD.backward (Function).
backward(node) -> nothing

Backward evaluation of the computation graph.
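A minimal usage sketch, following the pattern in the package README (tr and * are assumed to be registered operators):

using YAAD, LinearAlgebra

x1, x2 = Variable(rand(30, 30)), Variable(rand(30, 30))
y = tr(x1 * x2)    # the forward pass builds and evaluates a CachedNode

backward(y)        # propagate gradients back to the leaves
x1.grad            # the gradient accumulated in x1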

YAAD.forward (Function).
forward(node) -> output

Forward evaluation of the computation graph. This method calls the operator in the computation graph and updates the cache.

forward(f, args...) -> output

Forward evaluation for plain function calls.
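As a sketch, forward can re-evaluate a node after its inputs change (continuing the example above; the in-place update is hypothetical):

copyto!(value(x1), rand(30, 30))  # overwrite the value stored in x1
forward(y)                        # re-run the operators and refresh the cache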

YAAD.gradient (Function).
gradient(node, grad)

Returns the gradient of the call in node with respect to its arguments, given the gradient grad propagated back from the output.
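Users typically extend gradient to define the rule for a new primitive. A sketch following the pattern in the package README (register is documented below; the overload signature gradient(op, grad, output, args...) is assumed from that pattern):

import YAAD: register, gradient

# route calls on graph nodes into the computation graph
Base.sin(x::YAAD.AbstractNode) = register(Base.sin, x)

# gradient rule: return a tuple with one entry per argument
gradient(::typeof(Base.sin), grad, output, x) = (grad * cos(x), )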

YAAD.operator (Function).
operator(node) -> YAAD.Operator

Returns the operator called in this node.

YAAD.register (Method).
register(f, args...; kwargs...)

This is just an alias for constructing a CachedNode. Note, however, that this function is used to register a node on the tape in the global-tape implementation:

https://github.com/Roger-luo/YAAD.jl/tree/tape

YAAD.register_parameters (Function).
register_parameters(x::OperatorType) -> iterator

Returns an iterator over all parameters in the instance x of OperatorType. Note that OperatorType does not need to be a subtype of Operator here.
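A sketch of how this might be overloaded for a container of trainable variables (the Linear type here is hypothetical; note that it does not subtype Operator):

using YAAD

# a hypothetical layer-like container of trainable variables
struct Linear
    w::Variable{Matrix{Float64}}
    b::Variable{Vector{Float64}}
end

# expose its parameters as an iterator
YAAD.register_parameters(l::Linear) = (l.w, l.b)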

YAAD.value (Function).
value(node)

Returns the value when forwarding at the current node. value is different from forward: value only returns what the node already contains, and throws an error if the node does not contain anything.
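A small sketch of the distinction (assuming + on variables is registered):

v = Variable(rand(3))
value(v)     # the wrapped array; no evaluation involved

y = v + v    # a CachedNode
value(y)     # the cached output; throws if nothing has been cached yet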

YAAD.zero_grad! (Function).
zero_grad!(var)

Clears the gradient storage in the whole computation graph.
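A sketch of where this fits, reusing y = tr(x1 * x2) from the earlier example:

backward(y)      # gradients accumulate into x1.grad and x2.grad
zero_grad!(y)    # reset all gradient storage before the next backward pass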

AbstractArrayVariable{T, N}

Alias for AbstractVariable; an abstract type for variables that contain an array.

AbstractMatrixVariable{T}

Abstract type for variables that contain a matrix. See AbstractVariable for more.

AbstractNode

Abstract type for nodes in computation graph.

AbstractVariable{T} <: Value{T}

Abstract type for variables. Variables are types that contain a value and a gradient.

AbstractVectorVariable{T}

Abstract type for variables that contain a vector. See AbstractVariable for more.

YAAD.CachedNode (Type).
CachedNode{NT, OutT} <: AbstractNode

Stores the cached output of type OutT from a node of type NT in the computation graph. CachedNode is mutable; its output can be updated by forward.

YAAD.Node (Type).
Node{FT, ArgsT} <: AbstractNode

General node in a computation graph. It stores a callable operator f of type FT and its arguments args of type ArgsT, which should be a tuple.

YAAD.Value (Type).
Value{T} <: AbstractNode

Abstract type for nodes that contain a value in a computation graph.

YAAD.Variable (Type).
Variable{T} <: Value{T}

A kind of leaf node: a general type for variables in a computation graph. Similar to PyTorch's Variable, the gradient will be accumulated in var.grad.
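A sketch of the accumulation behaviour (assuming sum is a registered operator):

w = Variable(rand(2, 2))
backward(sum(w))   # first pass
backward(sum(w))   # second pass adds onto the first
w.grad             # holds the sum of both gradients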

YAAD.PrintTrait (Function).
PrintTrait(node) -> Trait

YAAD.backward_type_assert (Function).
backward_type_assert(node, grad)

Throws a more readable error message for the backward type check.

YAAD.forward! (Function).
forward!(output, ...) -> output

YAAD.kwargs (Function).
kwargs(node) -> NamedTuple

Returns the keyword arguments of the call in node.

YAAD.uncat (Method).
uncat(dims, cat_output, xs...) -> Vector{SubArray}

The reverse operation of Base.cat: it returns the corresponding Base.view of each input to a cat.
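A minimal sketch of the intended behaviour:

xs = (rand(2, 3), rand(2, 2))
out = cat(xs...; dims = 2)      # a 2×5 matrix
views = uncat(2, out, xs...)    # SubArrays viewing into out
size.(views)                    # [(2, 3), (2, 2)]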

ComputGraphStyle <: Broadcast.BroadcastStyle

This broadcast style forwards the broadcast expression to be registered in a computation graph rather than evaluating it directly.
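A sketch of the effect (assuming sin has a registered broadcast gradient):

x = Variable(rand(3))
y = sin.(x)    # the broadcast is recorded as a node in the graph
               # instead of eagerly producing a plain Array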

YAAD.Operator (Type).
Operator

Abstract type for operators in the computation graph.

Batched Operations

This section lists the operators defined in the YAAD.Batched module.

Operator Traits

YAAD.Trait (Module).
Trait

This module contains function traits; each is a subtype of Operator.

Broadcasted{FT} <: Operator

This trait wraps a callable object that is being broadcast. It helps dispatch to gradient methods overloaded for broadcast operations, as distinguished from Method.

Method{FT} <: Operator

This trait wraps a callable object in Julia (usually a Function).
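A sketch of how the two traits could separate the gradient rule for a plain call from the rule for a broadcast of the same function; the dispatch signatures below are assumptions, not the package's confirmed API:

import YAAD: gradient

# plain call: scalar-style rule (assumed signature)
gradient(::YAAD.Trait.Method{typeof(sin)}, grad, output, x) =
    (grad * cos(x), )

# broadcast call: element-wise rule (assumed signature)
gradient(::YAAD.Trait.Broadcasted{typeof(sin)}, grad, output, x) =
    (grad .* cos.(x), )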