NAME

    AI::MXNet::AutoGrad - Autograd for NDArray.

set_is_training

    Set status to training/not training. When training, a graph is constructed
    for gradient computation. Operators also run with is_train=1: for example,
    Dropout drops inputs randomly when is_train=1 and simply passes them
    through when is_train=0.

    Parameters
    ----------
    is_train: bool

    Returns
    -------
    The previous state before this call.
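
    Example
    -------
    A minimal sketch of the save/restore pattern (the Dropout operator is
    used purely for illustration):

        use AI::MXNet qw(mx);
        my $prev = mx->autograd->set_is_training(1);
        my $out  = mx->nd->Dropout(mx->nd->ones([2, 2]), p=>0.5);  # drops randomly
        mx->autograd->set_is_training($prev);                      # restore old state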

set_is_recording

    Set status to recording/not recording. When recording, a graph is
    constructed for gradient computation.

    Parameters
    ----------
    is_recording: bool

    Returns
    -------
    The previous state before this call.
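
    Example
    -------
    A minimal sketch; the return value lets you restore whatever state was
    active before:

        my $x = mx->nd->ones([2]);
        $x->attach_grad();
        my $prev = mx->autograd->set_is_recording(1);
        my $y = $x * 2;                         # recorded for gradient computation
        mx->autograd->set_is_recording($prev);  # restore previous state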

is_recording

    Get status on recording/not recording.

    Returns
    -------
    Current state of recording.

is_training

    Get status on training/predicting.

    Returns
    -------
    Current state of training/predicting.
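
    Example
    -------
    A minimal sketch showing both queries inside and outside a recording
    scope:

        print mx->autograd->is_recording;    # 0 outside any scope
        mx->autograd->record(sub {
            print mx->autograd->is_recording;    # 1
            print mx->autograd->is_training;     # 1 (record defaults to train_mode=1)
        });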

mark_variables

    Mark AI::MXNet::NDArrays as variables for which autograd should compute
    gradients.

    Parameters
    ----------
    ArrayRef[AI::MXNet::NDArray] $variables
    ArrayRef[AI::MXNet::NDArray] $gradients
    GradReq|ArrayRef[GradReq]   :$grad_reqs='write'
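
    Example
    -------
    A minimal sketch; this is essentially what $ndarray->attach_grad does
    under the hood:

        my $x    = mx->nd->ones([2, 2]);
        my $grad = mx->nd->zeros([2, 2]);
        mx->autograd->mark_variables([$x], [$grad]);
        my $y;
        mx->autograd->record(sub { $y = $x * $x });
        mx->autograd->backward([$y]);
        print $grad->aspdl;   # dy/dx = 2*x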

backward

    Compute the gradients of heads w.r.t. previously marked variables.

    Parameters
    ----------
    $heads: ArrayRef[AI::MXNet::NDArray]
        Output NDArray(s)
    :$head_grads=: Maybe[AI::MXNet::NDArray|ArrayRef[AI::MXNet::NDArray|Undef]]
        Gradients with respect to heads.
    :$retain_graph=0: bool, optional
        Whether to retain graph.
    :$train_mode=1: bool, optional
        Whether to do backward for training or predicting.
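
    Example
    -------
    A minimal sketch:

        my $x = mx->nd->ones([3]);
        $x->attach_grad();
        my $y;
        mx->autograd->record(sub { $y = $x * $x });
        mx->autograd->backward([$y]);
        print $x->grad->aspdl;   # dy/dx = 2*x -> [2 2 2]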

compute_gradient

    Compute the gradients of outputs w.r.t. variables.

    Parameters
    ----------
    outputs: array ref of NDArray

    Returns
    -------
    gradients: array ref of NDArray
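
    Example
    -------
    A minimal sketch, assuming variables were previously marked with
    mark_variables and the forward pass was recorded:

        my $x    = mx->nd->ones([2]);
        my $grad = mx->nd->zeros([2]);
        mx->autograd->mark_variables([$x], [$grad]);
        my $y;
        mx->autograd->record(sub { $y = $x * $x });
        my $grads = mx->autograd->compute_gradient([$y]);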

grad_and_loss

    Return a function that computes both the gradient of the arguments and
    the loss value.

    Parameters
    ----------
    func: a perl sub
        The forward (loss) function.
    argnum: an int or an array ref of ints
        The index (or indices) of the argument(s) to calculate gradients for.

    Returns
    -------
    grad_and_loss_func: a perl sub
        A sub that computes both the gradient of the arguments and the loss value.
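
    Example
    -------
    A minimal sketch, assuming the wrapped sub returns the gradients and
    the loss value in that order (mirroring the Python API):

        my $x = mx->nd->array([1, 2, 3]);
        my $g_and_l = mx->autograd->grad_and_loss(sub { my ($x) = @_; $x * $x });
        my ($grads, $loss) = $g_and_l->($x);
        print $grads->[0]->aspdl;   # d(x*x)/dx = 2*x -> [2 4 6]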

grad

    Compute the gradients of heads w.r.t. variables. Gradients are returned
    as new NDArrays instead of being stored into the variables' ->grad
    buffers. Supports recording a gradient graph for computing higher order
    gradients.

    Note: currently only a very limited set of operators supports higher
    order gradients.

    Parameters
    ----------
    $heads: NDArray or array ref of NDArray
        Output NDArray(s)
    $variables: NDArray or array ref of NDArray
        Input variables to compute gradients for.
    :$head_grads=: NDArray or array ref of NDArray or undef
        Gradients with respect to heads.
    :$retain_graph=: bool
        Whether to keep the computation graph for differentiating again,
        instead of clearing the history and releasing memory. Defaults to
        the same value as $create_graph.
    :$create_graph=0: bool
        Whether to record the gradient graph for computing higher order
        gradients.
    :$train_mode=1: bool, optional
        Whether to do backward for training or prediction.

    Returns
    -------
    NDArray or array ref of NDArray:
        Gradients with respect to variables.

    Examples
    --------
    >>> my $x = mx->nd->ones([1]);
    >>> $x->attach_grad();
    >>> my $z;
    >>> mx->autograd->record(sub {
            $z = mx->nd->elemwise_add(mx->nd->exp($x), $x);
        });
    >>> # grad returns an array ref of gradients when $variables is an array ref
    >>> my ($dx) = @{ mx->autograd->grad($z, [$x], create_graph=>1) };
    >>> print($dx->aspdl);          # dz/dx = exp(x) + 1
    [3.71828175]
    >>> $dx->backward();            # differentiate again thanks to create_graph=>1
    >>> print($x->grad->aspdl);     # d(dx)/dx = exp(x)
    [2.71828175]

train_mode

    Executes $sub within an autograd training scope context.
    See the example under predict_mode below.

    Parameters
    ----------
    CodeRef $sub: a perl sub

predict_mode

    Executes $sub within an autograd predicting scope context.

    Parameters
    ----------
    CodeRef $sub: a perl sub
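
    Example
    -------
    A minimal sketch contrasting train_mode and predict_mode (Dropout
    behaves differently in each):

        my $x = mx->nd->ones([2, 2]);
        my $y;
        mx->autograd->train_mode(sub   { $y = mx->nd->Dropout($x, p=>0.5) });  # drops randomly
        mx->autograd->predict_mode(sub { $y = mx->nd->Dropout($x, p=>0.5) });  # passes through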

record

    Executes $sub within an autograd recording scope context
    and captures code that needs gradients to be calculated.

    Parameters
    ----------
    CodeRef $sub: a perl sub
    Maybe[Bool] :$train_mode=1
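
    Example
    -------
    A minimal sketch:

        my $x = mx->nd->ones([2]);
        $x->attach_grad();
        my $y;
        mx->autograd->record(sub {
            $y = $x * $x + 1;
        });
        $y->backward();
        print $x->grad->aspdl;   # dy/dx = 2*x -> [2 2]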

pause

    Executes $sub within an autograd pausing scope context,
    for code that does not need gradients to be calculated.

    Parameters
    ----------
    CodeRef $sub: a perl sub
    Maybe[Bool] :$train_mode=0
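
    Example
    -------
    A minimal sketch; operations inside pause are excluded from the
    gradient graph:

        my $x = mx->nd->ones([2]);
        $x->attach_grad();
        my ($y, $z);
        mx->autograd->record(sub {
            $y = $x * $x;
            mx->autograd->pause(sub { $z = $y * 2 });   # not recorded
        });
        $y->backward();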

get_symbol

    Retrieve recorded computation history as `Symbol`.

    Parameters
    ----------
    $x : NDArray
        Array representing the head of the computation graph.

    Returns
    -------
    Symbol
        The retrieved Symbol.
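
    Example
    -------
    A minimal sketch; the tojson call is assumed here only to show that the
    result is a regular AI::MXNet::Symbol:

        my $x = mx->nd->ones([2]);
        $x->attach_grad();
        my $y;
        mx->autograd->record(sub { $y = $x * $x });
        my $sym = mx->autograd->get_symbol($y);
        print $sym->tojson;   # JSON of the recorded computation graph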