Transformers 4.39.3-foss-2023a

A newer install of Transformers is available.

State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0

Accessing Transformers 4.39.3-foss-2023a

To load the module for Transformers 4.39.3-foss-2023a, use the following commands on the BEAR systems (BlueBEAR and BEAR Cloud VMs):

module load bear-apps/2023a
module load Transformers/4.39.3-foss-2023a

There is a GPU-enabled version of this module: Transformers 4.39.3-foss-2023a-CUDA-12.1.1
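Once loaded, the module is typically used from a batch job. A minimal sketch of a SLURM submission script follows; the resource requests, QoS name, and the script name `my_transformers_script.py` are illustrative placeholders, not values taken from this page — replace them with your own project details:

```shell
#!/bin/bash
#SBATCH --time=1:0:0        # requested walltime (example value)
#SBATCH --ntasks=4          # number of tasks (example value)
#SBATCH --qos=myqos         # placeholder QoS; use your project's QoS

set -e

# Start from a clean environment, then load the module as described above
module purge
module load bear-apps/2023a
module load Transformers/4.39.3-foss-2023a

# Run a Python script that imports the transformers library
python my_transformers_script.py
```

Submit the script with `sbatch`; see the BlueBEAR Job Submission page for the correct resource options for your project.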

BEAR Apps Version

2023a

Architectures

  • EL8-icelake
  • EL8-sapphirerapids

Each listed architecture consists of two parts, OS-CPU: EL denotes the operating system (Enterprise Linux), and the second part identifies one of the several processor (CPU) types available on BlueBEAR. More information about the processor types on BlueBEAR is available on the BlueBEAR Job Submission page.

Extensions

  • regex 2023.12.25
  • transformers 4.39.3

More Information

For more information visit the Transformers website.

Dependencies

This version of Transformers has a direct dependency on:

  • foss/2023a
  • Python/3.11.3-GCCcore-12.3.0
  • PyTorch/2.1.2-foss-2023a
  • PyTorch-bundle/2.1.2-foss-2023a
  • PyYAML/6.0-GCCcore-12.3.0
  • Safetensors/0.4.3-gfbf-2023a
  • SciPy-bundle/2023.07-gfbf-2023a
  • tokenizers/0.15.2-GCCcore-12.3.0
  • tqdm/4.66.1-GCCcore-12.3.0

Other Versions

These versions of Transformers are available on the BEAR systems (BlueBEAR and BEAR Cloud VMs). These will be retained in accordance with our Applications Support and Retention Policy.

Last modified on 17th March 2025