
You finally got an account on our HPC? Welcome! Here is a short guide on things you need to know before starting computations.

Login 

Our cluster can be reached via ssh, provided you are either within the university network or connected via VPN. Under Windows, open the command line (cmd) or PowerShell; under Linux or macOS, open a terminal and type

ssh <$USER>@hpc.rz.uni-duesseldorf.de

Of course, replace <$USER> with your actual username. When you log in for the first time, the system may ask whether you want to continue connecting; answer with "yes". When prompted, type in your password (you may not see on the screen what you are typing, this is OK) and hit Enter. You'll find yourself in the home directory of our login host hpc-login7.
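If you log in frequently, it can be convenient to add a shortcut to the SSH configuration on your own machine (a minimal sketch assuming a standard OpenSSH client; the alias "hpc" is just an example):

# ~/.ssh/config on your own computer (optional convenience entry)
Host hpc
    HostName hpc.rz.uni-duesseldorf.de
    User <$USER>

Afterwards, ssh hpc is enough to connect.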

Please note that this node is intended only for logging in; do not start computations or other heavy tasks here - actual computations are performed on our compute nodes.

You may also connect to our JupyterHub server via your web browser at https://jupyter.hpc.rz.uni-duesseldorf.de to start Jupyter notebooks or terminal sessions.

File system

There are three important directories where you can store data:

Your home directory /home/<$USER> - this directory has a size limit of only 60 GB and should only be used to store small files, programs, and configuration data.

A project directory /gpfs/project/<$USER> - this directory is used for project data; it has a maximum capacity of 10 TB per user and its contents are backed up regularly.

Temporary files /gpfs/scratch/<$USER> - for storing temporary files during computations. The maximum capacity is 20 TB, there is no backup, and files older than 60 days are deleted automatically.

These directories are accessible from the login node and all compute nodes. The /gpfs directories are connected to a fast parallel file system and should be used for computations.
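To confirm that all three directories exist for your account and to check their permissions, a quick listing on the login node is sufficient (a minimal check using the paths listed above):

ls -ld /home/$USER /gpfs/project/$USER /gpfs/scratch/$USER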

Regular snapshots are created of the contents of /home and /gpfs/project, and you can restore your data from these snapshots. You can check your current quotas for these directories by loading the module "hpc-tools" on the login node and typing usage_report.py:

module load hpc-tools

usage_report.py

Transferring files to the HPC

Files can be transferred from your computer to the HPC and vice versa via scp; please use our storage server for this purpose, e.g.:

scp yourfile <$USER>@storage.hpc.rz.uni-duesseldorf.de:/gpfs/project/<$USER>

or use any scp client (FileZilla, WinSCP, etc.).
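Copying also works in the other direction and for whole directories; the following sketch uses hypothetical file and directory names:

scp -r my_input_dir <$USER>@storage.hpc.rz.uni-duesseldorf.de:/gpfs/project/<$USER>/

scp <$USER>@storage.hpc.rz.uni-duesseldorf.de:/gpfs/project/<$USER>/results.txt .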

You can also mount the above-mentioned directories on your PC. Use sshfs under Linux or macOS:

sshfs <$USER>@storage.hpc.rz.uni-duesseldorf.de:/gpfs/project/<$USER> your_local_dir
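When you are done, unmount the directory again (assuming the usual FUSE tools under Linux; under macOS, plain umount should work):

fusermount -u your_local_dir

under Linux, or

umount your_local_dir

under macOS.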

Windows and macOS users can also attach these directories as a network drive; see this page for further instructions.

Submitting jobs

We use PBS Professional for batch control and job submission; you can start interactive jobs or use scripts to define your computation tasks.

Interactive tasks, e.g.:

qsub -I -l select=1:ncpus=1:mem=4GB -l walltime=02:00:00 -A $PROJECT

The select statement defines the resources you require; the above line means you want access to 1 compute node with 1 CPU and 4 gigabytes of RAM for 2 hours. Please do not forget to attach your project name $PROJECT with the -A switch.
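The same pattern scales to larger requests; for example, a (hypothetical) interactive job with 8 CPUs and 32 GB of RAM on one node for four hours would be requested like this:

qsub -I -l select=1:ncpus=8:mem=32GB -l walltime=04:00:00 -A $PROJECT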

You can also write a batch script and submit this, e.g.

#!/bin/bash
#PBS -l select=1:ncpus=1:mem=4GB
#PBS -l walltime=02:00:00
#PBS -A $PROJECT

module load Python/3.10.4
python myscript.py


and submit the script with qsub yourscript.sh. Check your jobs with qstat -u <$USER>; jobs can be cancelled with qdel JOBID, where JOBID is the number reported by qstat.
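As a slightly fuller sketch that follows the file system recommendations above (the directory and file names are hypothetical, and it assumes myscript.py writes its output into ./results):

#!/bin/bash
#PBS -l select=1:ncpus=1:mem=4GB
#PBS -l walltime=02:00:00
#PBS -A $PROJECT

module load Python/3.10.4

# run in the fast scratch file system (no backup, files older than 60 days are deleted)
WORKDIR=/gpfs/scratch/$USER/myjob_$PBS_JOBID
mkdir -p "$WORKDIR"
cd "$WORKDIR"

# $PBS_O_WORKDIR is the directory from which the job was submitted
python "$PBS_O_WORKDIR"/myscript.py

# copy the results to the backed-up project directory
cp -r results /gpfs/project/$USER/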


