autograd 1.4-foss-2021a
There is a newer version of autograd available on the BEAR systems; see Other Versions below.
Autograd can automatically differentiate native Python and NumPy code. It can handle a large subset of Python's features, including loops, ifs, recursion and closures, and it can even take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments, as well as forward-mode differentiation, and the two can be composed arbitrarily. The main intended application of Autograd is gradient-based optimization.
Accessing autograd 1.4-foss-2021a
To load the module for autograd 1.4-foss-2021a please use this command on the BEAR systems (BlueBEAR and BEAR Cloud VMs):
module load autograd/1.4-foss-2021a
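Once the module is loaded, autograd can be used from Python in the usual way. The snippet below is a minimal sketch (not part of the BEAR documentation) illustrating reverse-mode differentiation and a derivative of a derivative; the function and variable names are illustrative only.

```python
# Minimal autograd example: grad() returns a function that computes the
# gradient of a scalar-valued function with respect to its first argument.
import autograd.numpy as np   # thinly wrapped NumPy
from autograd import grad

def tanh(x):
    return (1.0 - np.exp(-x)) / (1.0 + np.exp(-x))

d_tanh = grad(tanh)       # first derivative
dd_tanh = grad(d_tanh)    # derivative of a derivative

print(tanh(1.0), d_tanh(1.0), dd_tanh(1.0))
```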
BEAR Apps Version: 2021a
Architectures
EL8-cascadelake — EL8-icelake — EL8-sapphirerapids
Each listed architecture consists of two parts: OS-CPU. The OS is represented by EL, and there are several different processor (CPU) types available on BlueBEAR. More information about the processor types on BlueBEAR is available on the BlueBEAR Job Submission page.
More Information
For more information visit the autograd website.
Dependencies
This version of autograd has a direct dependency on: foss/2021a, Python/3.9.5-GCCcore-10.3.0 and SciPy-bundle/2021.05-foss-2021a.
Required By
This version of autograd is a direct dependency of: Meep/1.23.0-foss-2021a.
Other Versions
These versions of autograd are available on the BEAR systems (BlueBEAR and BEAR Cloud VMs). These will be retained in accordance with our Applications Support and Retention Policy.
| Version | BEAR Apps Version |
| --- | --- |
| 1.6.3-foss-2022b | 2022b |
| 1.5-foss-2021b | 2021b |
| 1.3-foss-2020a-Python-3.8.2 | 2020a |
Last modified on 13th July 2022