Development GPUs


AImageLab provides development GPUs that can be used for remote debugging. Hosts with development GPUs have the same configuration as all other aimagelab-srv nodes.

Access to development GPUs

All AImageLab-SRV users can access the development GPUs. The development resources are managed by SLURM, within the partitions dev and students-dev, respectively dedicated to the research staff (students, doctoral students, research fellows, collaborators) and students (thesis and non-thesis students).
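
For reference, a quick way to inspect the state and time limits of these partitions from the login node is SLURM's sinfo command (a standard SLURM tool; the exact output columns depend on the cluster configuration):

# List the development partitions, their time limits and node states
sinfo --partition=dev,students-dev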

Limits

Research staff: Up to 4 development GPUs can be requested at the same time. Each development session lasts a maximum of 4 hours.

Students: Up to 3 development GPUs can be requested at the same time. Each development session lasts a maximum of 2 hours.
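
As an illustration of these limits, a research-staff session requesting more than one GPU with an explicit time limit could look like the following (the GPU count and duration are arbitrary example values within the limits above):

# Request 2 development GPUs for 4 hours (research staff: max 4 GPUs, 4 hours)
srun --partition=dev --gres=gpu:2 --time=04:00:00 --pty bash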

Use from shell

⚠️ Warning: It is forbidden to access GPUs on development nodes without going through the SLURM scheduler.

To get a shell with a development GPU, use the following command from the login node or a development node:

Research staff: srun --partition=dev --gres=gpu:1 --pty bash

Students: srun --partition=students-dev --gres=gpu:1 --pty bash
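
Once the shell is running on a development node, you can verify which GPU the scheduler assigned to the session, for example with the following standard SLURM/NVIDIA commands (not specific to this cluster):

# GPUs made visible to this job by the scheduler
echo $CUDA_VISIBLE_DEVICES
# Details on the allocated GPU(s)
nvidia-smi
# Your running jobs, including the remaining session time
squeue -u $USER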

Use from IDEs

Visual Studio Code

  • From the Extensions tab, install the extensions Remote - SSH and Remote - SSH: Editing Configuration Files
  • Connect to aimagelab-srv-00.ing.unimore.it (e.g. via: F1, Remote-SSH: Connect to Host...) and open your project folder (e.g. via: F1, File: Open Folder...)
  • From the Extensions tab, install the extensions Python and Pylance on the remote host
  • Create the file $projectFolder/bash.sh, where $projectFolder indicates the project folder, containing (a complete sketch of this file is given after this list):
    Research staff: srun -Q --immediate=10 --partition=dev --gres=gpu:1 --time 60:00 --pty bash
    Students: srun -Q --immediate=10 --partition=students-dev --gres=gpu:1 --time 60:00 --pty bash
  • Check that the file $projectFolder/bash.sh has execute permissions. Otherwise run chmod +x $projectFolder/bash.sh
  • Open (or create if non-existent) the file $projectFolder/.vscode/settings.json, and make sure it contains the following key-value pair:

"terminal.integrated.automationProfile.linux": { "path": "$projectFolder/bash.sh" }