AImageLab-HPC

Using Python

Last updated: April 3, 2026


Python is available on AImageLab-HPC through environment modules. This page explains how to set up and manage Python environments.

Available Python Versions

python/3.11.11-gcc-11.4.0 is loaded by default at login. Additional Python versions are available as modules:

module load python/3.9.21-gcc-11.4.0
module load python/3.10.16-gcc-11.4.0
module load python/3.11.11-gcc-11.4.0   # default

Anaconda is also available but must be loaded explicitly:

module load anaconda3

Virtual Environments

Virtual environments created with venv are strongly preferred over Conda environments: the latter can easily grow to several gigabytes, while venv environments are lightweight and contain only the packages you explicitly install.

Creating a virtual environment

Load the Python version you need:

module load python/3.11.11-gcc-11.4.0

Create the virtual environment:

python -m venv /path/to/my_env

Note: my_env is just an example; name your environment however you like. Create your virtual environments inside /homes/<username> (e.g. /homes/<username>/envs/my_env): NFS handles the many small files that make up a venv better than BeeGFS (/work), which is optimised for large sequential reads.
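Putting the steps together, a minimal sketch of creating and inspecting a venv (the path below uses a temporary directory purely for illustration; on the cluster, put environments under /homes/<username>/envs as noted above):

```shell
# Temporary path for illustration only; on the cluster, create envs
# under /homes/<username>/envs instead.
ENV_DIR="$(mktemp -d)/my_env"
python3 -m venv "$ENV_DIR"

# The env carries its own interpreter and pip alongside the activate script.
"$ENV_DIR/bin/python" --version
ls "$ENV_DIR/bin"
```

The environment is self-contained: deleting the directory removes it completely.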

Activating and installing packages

source /path/to/my_env/bin/activate
pip install <package>

Once activated, your shell prompt will show the environment name:

(my_env) user@ailb-login-02:~$
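To confirm that activation worked, check that python now resolves to the interpreter inside the environment; a quick sketch (temporary path for illustration, substitute your real environment path):

```shell
# Temporary path for illustration; use your real env path on the cluster.
ENV_DIR="$(mktemp -d)/my_env"
python3 -m venv "$ENV_DIR"
source "$ENV_DIR/bin/activate"

# After activation, "python" resolves to the env's own interpreter,
# and sys.prefix points at the env directory.
command -v python
python -c 'import sys; print(sys.prefix)'
```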

Deactivating

deactivate

Using Your Environment in SLURM Batch Jobs

Activate the virtual environment in your job script before running Python:

#!/bin/bash
#SBATCH --job-name my_job
#SBATCH --partition <partition>
#SBATCH --gres gpu:1

module load python/3.11.11-gcc-11.4.0   # same version used to create the env
source /path/to/my_env/bin/activate

python my_script.py

If your environment was created on top of Anaconda’s Python, load and initialise Conda first:

#!/bin/bash
#SBATCH --job-name my_job
#SBATCH --partition <partition>
#SBATCH --gres gpu:1

module load anaconda3
. /usr/local/anaconda3/etc/profile.d/conda.sh
source /path/to/my_env/bin/activate

python my_script.py
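To keep batch jobs reproducible, it can help to snapshot the environment's package versions with pip freeze and reinstall from the resulting file when recreating the env; a sketch with illustrative paths and file name:

```shell
# Illustrative paths; point these at your real environment.
ENV_DIR="$(mktemp -d)/my_env"
python3 -m venv "$ENV_DIR"

# Record the exact versions of everything currently installed ...
"$ENV_DIR/bin/pip" freeze > requirements.txt

# ... and reinstall them when recreating the environment elsewhere.
"$ENV_DIR/bin/pip" install -r requirements.txt
```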

Jupyter Notebooks

To use your virtual environment as a Jupyter kernel, register it with ipykernel:

source /path/to/my_env/bin/activate
pip install ipykernel
python -m ipykernel install --user --name my_env --display-name "Python (my_env)"

For a full walkthrough of how to start a Jupyter session on a compute node and connect to it from your local browser, see Jupyter Notebook on a Compute Node.

Common Issues

  • Running out of disk space: Prefer venv over Conda environments. If your environment is very large, consider removing unused packages rather than moving it to /work.

  • Jupyter kernel not connecting: Make sure you registered the kernel from inside the correct activated environment, and that port forwarding is set up correctly.

  • Model downloads filling up storage: When loading pretrained models (e.g. via Hugging Face from_pretrained()), set a custom cache directory to avoid filling your home directory:

    export HF_HOME=/path/to/large/storage/.cache/huggingface
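Because libraries typically read HF_HOME when they are imported, a quick sanity check is to confirm the variable is visible from Python before running your script (reusing the placeholder path from above):

```shell
# Placeholder path from above; substitute a directory with enough quota.
export HF_HOME=/path/to/large/storage/.cache/huggingface

# Confirm the setting is visible to Python; libraries that honour
# HF_HOME will cache downloads under this directory.
python3 -c 'import os; print(os.environ.get("HF_HOME"))'
```

Adding the export line to ~/.bashrc makes the setting persistent across logins.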