Transformers 4.24.0-foss-2022a
State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0
Accessing Transformers 4.24.0-foss-2022a
To load the module for Transformers 4.24.0-foss-2022a please use these commands on the BEAR systems (BlueBEAR and BEAR Cloud VMs):
module load bear-apps/2022a
module load Transformers/4.24.0-foss-2022a
There is a GPU-enabled version of this module: Transformers 4.24.0-foss-2022a-CUDA-11.7.0
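As a minimal sketch of how these commands might be used in practice, the job script below loads the module and runs a quick import check. It follows the usual BlueBEAR Slurm conventions as an assumption; the QoS, time, and task settings are placeholders to be replaced with values appropriate for your project (see the BlueBEAR Job Submission page).

#!/bin/bash
#SBATCH --ntasks=1
#SBATCH --time=0:10:0
#SBATCH --qos=bbdefault   # placeholder QoS; use the one assigned to your project

set -e

module purge
module load bluebear   # assumed standard first step in BlueBEAR job scripts
module load bear-apps/2022a
module load Transformers/4.24.0-foss-2022a

# Confirm the library is importable and report the version provided by the module
python -c "import transformers; print(transformers.__version__)"

For GPU jobs, the same script would load Transformers/4.24.0-foss-2022a-CUDA-11.7.0 instead and request GPU resources as described on the BlueBEAR Job Submission page.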
BEAR Apps Version
2022a
Architectures
EL8-cascadelake — EL8-icelake — EL8-sapphirerapids
The listed architectures consist of two parts: OS-CPU. The OS is represented by EL, and there are several different processor (CPU) types available on BlueBEAR. More information about the processor types on BlueBEAR is available on the BlueBEAR Job Submission page.
Extensions
- accelerate 0.19.0
- huggingface-hub 0.14.1
- regex 2023.5.5
- tokenizers 0.13.3
- transformers 4.24.0
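As an optional, hedged check (not part of the official module documentation), the extensions bundled with this module can be verified from a shell session after loading it:

python -c "import accelerate, huggingface_hub, regex, tokenizers, transformers; print(accelerate.__version__, huggingface_hub.__version__, regex.__version__, tokenizers.__version__, transformers.__version__)"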
More Information
For more information visit the Transformers website.
Dependencies
This version of Transformers has a direct dependency on:
- foss/2022a
- Python/3.10.4-GCCcore-11.3.0
- PyTorch/1.12.1-foss-2022a
- PyYAML/6.0-GCCcore-11.3.0
- SciPy-bundle/2022.05-foss-2022a
- tqdm/4.64.0-GCCcore-11.3.0
Other Versions
These versions of Transformers are available on the BEAR systems (BlueBEAR and BEAR Cloud VMs). These will be retained in accordance with our Applications Support and Retention Policy.
Version | BEAR Apps Version
--- | ---
4.24.0-foss-2022a-CUDA-11.7.0 | 2022a
Last modified on 23rd May 2023