Autograd can automatically differentiate native Python and NumPy code. It can handle a large subset of Python's features, including loops, ifs, recursion and closures, and it can even take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments, as well as forward-mode differentiation, and the two can be composed arbitrarily. The main intended application of Autograd is gradient-based optimization.
Accessing autograd 1.3-foss-2020a-Python-3.8.2
To load the module for autograd 1.3-foss-2020a-Python-3.8.2 please use this command on the BEAR systems (BlueBEAR, BEARCloud VMs, and CaStLeS VMs):
module load autograd/1.3-foss-2020a-Python-3.8.2
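On BlueBEAR this module load would typically sit inside a Slurm batch script. The sketch below is a generic example, not an official template: the walltime and the `module load bluebear` base-environment line are assumptions, and the final line simply checks that the loaded Python can import autograd.

```shell
#!/bin/bash
#SBATCH --ntasks=1
#SBATCH --time=0:15:0   # placeholder walltime; adjust for your job

set -e
module purge
module load bluebear    # assumption: standard BlueBEAR base environment
module load autograd/1.3-foss-2020a-Python-3.8.2

# Confirm the module's Python environment provides autograd
python -c "import autograd; print('autograd imported OK')"
```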
For more information visit the autograd website.
This version of autograd has a direct dependency on:
- foss/2020a
- Python/3.8.2-GCCcore-9.3.0
- SciPy-bundle/2020.03-foss-2020a-Python-3.8.2
The following module directly depends on this version of autograd:
- Meep/1.17.1-foss-2020a-Python-3.8.2
These versions of autograd are available on the BEAR systems (BlueBEAR, BEARCloud VMs, and CaStLeS VMs). These will be retained in accordance with our Applications Support and Retention Policy.
|Version|BEAR Apps Version|
|---|---|
|1.3-foss-2020a-Python-3.8.2|2020a|
Last modified on 29th January 2021