...
- Apply for membership in the `plggcryospar` team in Portal PLGrid, and ask through Helpdesk PLGrid for registration in Cyfronet's internal cryoSPARC users database and for a dedicated port for access to the cryoSPARC master.
- Log in to the Prometheus login node:

```bash
# Log into Prometheus login node
ssh <login>@pro.cyfronet.pl
```
- Load the cryoSPARC module:

```bash
# Set cryoSPARC environment
module add plgrid/apps/cryosparc/3.1
```
- Run the cryoSPARC configuration script. It will configure your cryoSPARC environment, create your user in the cryoSPARC database, and configure two lanes for external jobs: `prometheus-gpu`, which uses the `plgrid-gpu` partition for GPU jobs, and `prometheus-gpu-v100`, which uses the `plgrid-gpu-v100` partition. Both lanes use the `plgrid` partition for CPU-only jobs. Pass as arguments your license ID, your e-mail and a password (they will be used to log in to the cryoSPARC web app), and your first and last name:

```bash
# Configure cryoSPARC
cryosparc_configuration --license <XXXX> --email <your-email> --password <password> --firstname <Givenname> --lastname <Surname>
```
**Info: Access problems** - in case of a "`cryosparc_configuration: command not found`" error, run in the terminal:

```bash
newgrp plggcryospar
```

to start a new subshell with the permissions of the `plggcryospar` team.

**Info: Optional lanes/clusters** - you can create additional lanes/clusters for other maximal durations of SLURM jobs:
- copy the cluster config `cluster_info.json` and the script template `cluster_script.sh` from the directory `/net/software/local/cryosparc/3.1/cyfronet` to your working directory
- modify the files accordingly:
    - in `cluster_info.json`, change the name of the lane/cluster to avoid overwriting the default prometheus lane
    - in `cluster_script.sh`, change `--time`, `--partition`, or other parts of the script template as needed
- run the command `cryosparcm cluster connect <name-of-cluster-from-cluster_info.json>` to add the lane/cluster
- repeat the above steps to create another lane if necessary
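As a sketch of the renaming step above: the lane name in `cluster_info.json` can be changed with a quick `sed` edit. The JSON below is illustrative only (the real file copied from `/net/software/local/cryosparc/3.1/cyfronet` contains more fields), and the new lane name `prometheus-gpu-72h` is a made-up example:

```bash
# Illustrative stand-in for cluster_info.json with just the "name" key;
# the real template copied from the cyfronet directory has more fields.
cat > cluster_info.json <<'EOF'
{
    "name": "prometheus-gpu"
}
EOF

# Rename the lane so it does not overwrite the default prometheus lane
sed -i 's/"name": "prometheus-gpu"/"name": "prometheus-gpu-72h"/' cluster_info.json
grep '"name"' cluster_info.json
```

After editing both files, `cryosparcm cluster connect` picks up the new name from `cluster_info.json`.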
**Info: Access to GPU partitions** - to use GPUs on the Prometheus cluster you have to apply for GPU resources in Portal PLGrid.
To check whether you have access to a partition, run the commands below on the Prometheus login node and check whether your accounts appear on the `AllowAccounts` list.

Partition `plgrid-gpu`:

```bash
scontrol show partition plgrid-gpu | grep Accounts
```

Partition `plgrid-gpu-v100`:

```bash
scontrol show partition plgrid-gpu-v100 | grep Accounts
```

If you do not have access to one of the two partitions above, please contact Helpdesk PLGrid.
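For illustration, this is what the check looks like on a sample line of output (the account names and the one-line `scontrol` output below are fabricated; on Prometheus you would pipe the real command output instead):

```bash
# Hypothetical one-line sample of `scontrol show partition` output
sample='PartitionName=plgrid-gpu AllowGroups=ALL AllowAccounts=plggexample1,plggexample2 MaxTime=3-00:00:00'

# Extract the AllowAccounts field and check it for your grant account
echo "$sample" | grep -o 'AllowAccounts=[^ ]*'
echo "$sample" | grep -o 'AllowAccounts=[^ ]*' | grep -q 'plggexample1' && echo "access OK"
```

If your grant account is listed in `AllowAccounts`, you can submit to that partition.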
- Your cryoSPARC master setup is now done. All subsequent cryoSPARC master instances should be run as batch jobs.
...
Submit the job:

```bash
# Job submission
sbatch cryosparc-master.slurm
```
**Warning: cryoSPARC master job** - there should be only one job running the cryoSPARC master in the `plgrid-services` partition per user.

Check whether the job has started:

```bash
# Job status
squeue -j <JobID>
```
Common job states:

- `PD` (PENDING) - the job is awaiting resource allocation.
- `R` (RUNNING) - the job currently has an allocation and is running.
- `CF` (CONFIGURING) - the job has been allocated resources, which are waiting to become ready for use (e.g. booting). On Prometheus the `CF` state can last up to 8 minutes when the nodes have been in power-save mode.
- `CG` (COMPLETING) - the job is in the process of completing; some processes on some nodes may still be active.
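The state column of `squeue` can also be read programmatically, e.g. to wait in a script until the master is running. The snippet below parses sample output (the job ID, partition, and node name are fabricated for illustration):

```bash
# Hypothetical `squeue -j <JobID>` output for a cryoSPARC master job
squeue_out='JOBID PARTITION NAME ST TIME NODES NODELIST(REASON)
49145683 plgrid-se cryospar R 1:23 1 p0001'

# Extract the state (ST) column from the job line
state=$(echo "$squeue_out" | awk 'NR==2 {print $4}')
echo "$state"
```

A state of `R` means the master is running and the job log file with the tunnel instructions has been written.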
Make a tunnel
In your working directory, display the job log file:

```bash
cat cryosparc-master-log-<JobID>.txt
```

where `<JobID>` is the ID of your sbatch job, displayed when you submit it, e.g. `cat cryosparc-master-log-49145683.txt`. It will show you something like this:
```text
Copy/Paste this in your local terminal to ssh tunnel with remote
-----------------------------------------------------------------
ssh -o ServerAliveInterval=300 -N -L 40100:172.20.68.193:40100 plgusername@pro.cyfronet.pl
-----------------------------------------------------------------
Then open a browser on your local machine to the following address
------------------------------------------------------------------
localhost:40100
------------------------------------------------------------------
```
In another shell on your local computer, execute the given command to create the tunnel:

```bash
# Tunneling
ssh -o ServerAliveInterval=300 -N -L 40100:172.20.68.193:40100 plgusername@pro.cyfronet.pl
```
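If you reconnect often, the same tunnel can be kept as an entry in your local `~/.ssh/config` (a sketch; the host alias `cryosparc-tunnel` is made up, and `plgusername`, the port, and the master node IP must match what your own job log prints):

```
# Hypothetical ~/.ssh/config entry; port and IP come from cryosparc-master-log-<JobID>.txt
Host cryosparc-tunnel
    HostName pro.cyfronet.pl
    User plgusername
    ServerAliveInterval 300
    LocalForward 40100 172.20.68.193:40100
```

With this entry, `ssh -N cryosparc-tunnel` opens the same tunnel.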
- Log in to the cryoSPARC web application: open `localhost:40100` in your browser.
...