AI::MXNet::AutoGrad - Autograd for NDArray.
set_is_training

Set status to training/not training. When training, a graph will be constructed
for gradient computation. Operators will also run with ctx.is_train=True. For
example, Dropout will drop inputs randomly when is_train=True, and simply pass
them through when is_train=False.

Parameters
----------
is_train: bool

Returns
-------
Previous state before this set.
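A short sketch of how the training flag changes operator behavior. The Dropout call and the p=>0.5 rate are illustrative choices, not part of this method's interface:

```perl
use AI::MXNet qw(mx);

# Dropout only drops inputs while the training flag is on.
my $x = mx->nd->ones([10]);

my $prev = mx->autograd->set_is_training(1);   # returns the previous state
my $y_train = mx->nd->Dropout($x, p => 0.5);   # roughly half the entries zeroed
mx->autograd->set_is_training($prev);          # restore the previous state

my $y_pred = mx->nd->Dropout($x, p => 0.5);    # pass-through when not training
print $y_pred->aspdl, "\n";                    # all ones
```

Saving and restoring the returned previous state, as above, avoids clobbering an enclosing training scope.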
set_is_recording

Set status to recording/not recording. When recording, a graph will be
constructed for gradient computation.

Parameters
----------
is_recording: bool

Returns
-------
Previous state before this set.
is_recording

Get status on recording/not recording.

Returns
-------
Current state of recording.
is_training

Get status on training/predicting.

Returns
-------
Current state of training/predicting.
mark_variables

Mark AI::MXNet::NDArrays as variables for which autograd should compute
gradients.

Parameters
----------
ArrayRef[AI::MXNet::NDArray] $variables
ArrayRef[AI::MXNet::NDArray] $gradients
GradReq|ArrayRef[GradReq] :$grad_reqs='write'
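A minimal sketch of pairing a variable with a gradient buffer via mark_variables; the toy function y = 2*x*x is only an illustration:

```perl
use AI::MXNet qw(mx);

# Mark $x as a variable and supply $dx as the buffer that will
# receive its gradient after backward().
my $x  = mx->nd->ones([2]);
my $dx = mx->nd->zeros([2]);
mx->autograd->mark_variables([$x], [$dx]);

# Record a computation, then backpropagate: d(2*x*x)/dx = 4*x.
my $y;
mx->autograd->record(sub { $y = $x * $x * 2; });
$y->backward();
print $dx->aspdl, "\n";   # gradient lands in $dx: 4 at x = 1
```

The higher-level $x->attach_grad() used elsewhere in this document is a convenience over exactly this pattern.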
backward

Compute the gradients of heads w.r.t. previously marked variables.

Parameters
----------
$heads: ArrayRef[AI::MXNet::NDArray]
    Output NDArray(s).
:$head_grads=: Maybe[AI::MXNet::NDArray|ArrayRef[AI::MXNet::NDArray|Undef]]
    Gradients with respect to heads.
:$retain_graph=0: bool, optional
    Whether to retain the graph.
:$train_mode=1: bool, optional
    Whether to do backward for training or predicting.
compute_gradient

Compute the gradients of outputs w.r.t. variables.

Parameters
----------
outputs: array ref of NDArray

Returns
-------
gradients: array ref of NDArray
grad_and_loss

Return a function that computes both the gradient of the arguments and the
loss value.

Parameters
----------
func: a Perl sub
    The forward (loss) function.
argnum: an int or an array ref of ints
    The index of the argument(s) to calculate the gradient for.

Returns
-------
grad_and_loss_func: a Perl sub
    A function that computes both the gradient of the arguments and the loss
    value.
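A sketch of wrapping a loss sub with grad_and_loss. The return order (gradients first, then the loss) is assumed here from the corresponding Python autograd API, and the loss sum(x*x) is only an illustration:

```perl
use AI::MXNet qw(mx);

# Wrap a forward (loss) function; calling the wrapper evaluates it
# and differentiates it w.r.t. its arguments in one call.
my $grad_loss = mx->autograd->grad_and_loss(sub {
    my ($x) = @_;
    return mx->nd->sum($x * $x);   # loss = sum(x^2), so d/dx = 2*x
});

my $x = mx->nd->array([1, 2, 3]);
my ($grads, $loss) = $grad_loss->($x);
print $grads->[0]->aspdl, "\n";    # 2*x
```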
grad

Compute the gradients of heads w.r.t. variables. Gradients will be returned as
new NDArrays instead of being stored into `variable.grad`. Supports recording
the gradient graph for computing higher order gradients.

.. Note: Currently only a very limited set of operators support higher order
gradients.

Parameters
----------
$heads: NDArray or array ref of NDArray
    Output NDArray(s).
$variables: NDArray or array ref of NDArray
    Input variables to compute gradients for.
:$head_grads=: NDArray or array ref of NDArray or undef
    Gradients with respect to heads.
:$retain_graph=: bool
    Whether to keep the computation graph in order to differentiate again,
    instead of clearing the history and releasing memory. Defaults to the same
    value as create_graph.
:$create_graph=0: bool
    Whether to record the gradient graph for computing higher order gradients.
:$train_mode=1: bool, optional
    Whether to do backward for training or prediction.

Returns
-------
NDArray or array ref of NDArray:
    Gradients with respect to variables.

Examples
--------
>>> $x = mx->nd->ones([1]);
>>> $x->attach_grad();
>>> mx->autograd->record(sub {
        $z = mx->nd->elemwise_add(mx->nd->exp($x), $x);
    });
>>> $dx = mx->autograd->grad($z, [$x], create_graph=>1);
>>> $dx->backward();
>>> print($dx->grad->aspdl);
[3.71828175]
train_mode

Executes $sub within an autograd training scope context.

Parameters
----------
CodeRef $sub: a Perl sub
predict_mode

Executes $sub within an autograd predicting scope context.

Parameters
----------
CodeRef $sub: a Perl sub
record

Executes $sub within an autograd recording scope context and captures code
that needs gradients to be calculated.

Parameters
----------
CodeRef $sub: a Perl sub
Maybe[Bool] :$train_mode=1
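A minimal end-to-end sketch of record with attach_grad and backward; the function y = x^2 + 1 is only an illustration:

```perl
use AI::MXNet qw(mx);

# record() captures the computation inside the sub so that
# backward() can differentiate it; code outside is not recorded.
my $x = mx->nd->array([1, 2, 3]);
$x->attach_grad();               # allocate a gradient buffer for $x
my $y;
mx->autograd->record(sub {
    $y = $x * $x + 1;            # y = x^2 + 1, recorded
});
$y->backward();                  # dy/dx = 2*x
print $x->grad->aspdl, "\n";     # [2 4 6]
```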
pause

Executes $sub outside of an autograd recording scope context, for code that
does not need gradients to be calculated.

Parameters
----------
CodeRef $sub: a Perl sub
Maybe[Bool] :$train_mode=0
get_symbol

Retrieve recorded computation history as a Symbol.

Parameters
----------
x : NDArray
    Array representing the head of the computation graph.

Returns
-------
Symbol
    The retrieved Symbol.
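A sketch of recovering a recorded imperative computation as a Symbol; the list_arguments call on the result is a standard AI::MXNet::Symbol method used here for illustration:

```perl
use AI::MXNet qw(mx);

# Record an imperative computation, then recover it as a Symbol.
my $x = mx->nd->ones([2]);
$x->attach_grad();
my $y;
mx->autograd->record(sub { $y = $x * 2 + 1; });

my $sym = mx->autograd->get_symbol($y);
print join(', ', @{ $sym->list_arguments }), "\n";   # inputs of the recorded graph
```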
INSTALLATION

To install AI::MXNet, copy and paste the appropriate command into your
terminal.

cpanm:

    cpanm AI::MXNet

CPAN shell:

    perl -MCPAN -e shell
    install AI::MXNet

For more information on module installation, please visit the detailed CPAN
module installation guide.