Note: there is a newer version of autograd available.
Autograd can automatically differentiate native Python and NumPy code. It can handle
a large subset of Python's features, including loops, conditionals, recursion, and closures, and it can even
take derivatives of derivatives of derivatives. It supports reverse-mode differentiation
(a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions
with respect to array-valued arguments, as well as forward-mode differentiation, and the two can be
composed arbitrarily. The main intended application of Autograd is gradient-based optimization.
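A minimal sketch of that workflow (the tanh function is adapted from the upstream autograd documentation; make_jvp is the forward-mode entry point the package exports):

import autograd.numpy as np   # autograd's thin wrapper around NumPy
from autograd import grad, make_jvp

def tanh(x):
    # Plain Python plus wrapped NumPy, so autograd can trace and differentiate it
    return (1.0 - np.exp(-2 * x)) / (1.0 + np.exp(-2 * x))

d_tanh = grad(tanh)       # reverse-mode derivative of a scalar-valued function
dd_tanh = grad(d_tanh)    # a derivative of a derivative

print(d_tanh(1.0))        # ~0.4200
print(dd_tanh(1.0))       # ~-0.6397

# Forward mode: evaluate tanh at x = 1.0 together with a Jacobian-vector product
value, jvp = make_jvp(tanh)(1.0)(1.0)
print(value, jvp)         # for a scalar input, jvp matches d_tanh(1.0)

Since grad returns an ordinary Python function, its result can be dropped straight into a gradient-descent loop, which is the optimization use case noted above.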
Accessing autograd 1.5-foss-2021b
To load the module for autograd 1.5-foss-2021b, use the following command on the BEAR systems (BlueBEAR, BEAR Cloud VMs, and CaStLeS VMs):
module load autograd/1.5-foss-2021b
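Once the module is loaded, one quick way to confirm that the expected version is picked up is to query the package metadata from Python (a minimal sketch; importlib.metadata is part of the standard library in the Python provided by 2021b-era toolchains):

from importlib.metadata import version
print(version("autograd"))   # expect 1.5 once the module above is loaded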
BEAR Apps Version: 2021b
For more information, visit the autograd website: https://github.com/HIPS/autograd
This version of autograd has a direct dependency on: foss/2021b
This version of autograd is a direct dependent of:
These versions of autograd are available on the BEAR systems (BlueBEAR, BEAR Cloud VMs, and CaStLeS VMs). These will be retained in accordance with our Applications Support and Retention Policy.
Last modified on 30th November 2022