Transformers 4.24.0-foss-2022a-CUDA-11.7.0

State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0

Accessing Transformers 4.24.0-foss-2022a-CUDA-11.7.0

To load the module for Transformers 4.24.0-foss-2022a-CUDA-11.7.0, use these commands on the BEAR systems (BlueBEAR, BEAR Cloud VMs, and CaStLeS VMs):

module load bear-apps/2022a
module load Transformers/4.24.0-foss-2022a-CUDA-11.7.0

There is a CPU version of this module: Transformers 4.24.0-foss-2022a
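On BlueBEAR this module would typically be loaded inside a Slurm batch job rather than on a login node. The sketch below is a minimal, hypothetical job script assuming a GPU QoS named bbgpu and a placeholder project account; substitute the QoS, account, and resource values appropriate to your project (see the BlueBEAR Job Submission page).

```shell
#!/bin/bash
#SBATCH --qos=bbgpu                 # hypothetical GPU QoS name; use your project's QoS
#SBATCH --account=my_project        # placeholder; replace with your BlueBEAR project account
#SBATCH --gres=gpu:1                # request one GPU (A100 or A30 on EL8-icelake)
#SBATCH --time=0:30:0               # wall-clock limit for the job

set -e
module purge
module load bear-apps/2022a
module load Transformers/4.24.0-foss-2022a-CUDA-11.7.0

# Sanity check: confirm the transformers package imports and a GPU is visible
python -c "import transformers, torch; print(transformers.__version__, torch.cuda.is_available())"
```

Submit with `sbatch jobscript.sh`; for CPU-only work, swap in the Transformers/4.24.0-foss-2022a module and drop the `--gres` and QoS lines.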

BEAR Apps Version

2022a

Architectures

EL8-icelake (GPUs: NVIDIA A100, NVIDIA A30)

The listed architectures consist of two parts: OS-CPU. The OS is represented by EL, and there are several different processor (CPU) types available on BlueBEAR. More information about the processor types on BlueBEAR is available on the BlueBEAR Job Submission page.

Extensions

  • accelerate 0.19.0
  • huggingface-hub 0.14.1
  • regex 2023.5.5
  • tokenizers 0.13.3
  • transformers 4.24.0

More Information

For more information visit the Transformers website.

Dependencies

This version of Transformers has a direct dependency on:

  • CUDA/11.7.0
  • foss/2022a
  • Python/3.10.4-GCCcore-11.3.0
  • PyTorch/1.12.1-foss-2022a-CUDA-11.7.0
  • PyYAML/6.0-GCCcore-11.3.0
  • SciPy-bundle/2022.05-foss-2022a
  • tqdm/4.64.0-GCCcore-11.3.0

Other Versions

These versions of Transformers are available on the BEAR systems (BlueBEAR, BEAR Cloud VMs, and CaStLeS VMs). These will be retained in accordance with our Applications Support and Retention Policy.

Version             BEAR Apps Version
4.24.0-foss-2022a   2022a

Last modified on 23rd May 2023