Jupyter notebooks are usually started on localhost: a local web server is launched and you interact with the notebook through a web browser.
On Ares we cannot easily expose this web server to the outside world, as calculations are done on a compute node that is not visible from the Internet.
The trick is to start Jupyter via a job submitted to a compute node and then create an SSH tunnel to access it from your local PC.
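Conceptually, the tunnel forwards a port on your local machine, through the Ares login node, to the compute node running Jupyter. The sketch below only assembles and prints the forwarding command; the port `8511`, node IP `172.20.68.193`, and user `plgusername` are illustrative placeholders, not values you will actually receive:

```bash
# Assemble the SSH port-forwarding command (illustrative placeholder values).
ipnport=8511            # port Jupyter listens on (picked at random by the job)
ipnip=172.20.68.193     # internal IP of the compute node running Jupyter
user=plgusername        # your PLGrid login

# -N: no remote command, just forward ports
# -L local_port:remote_host:remote_port: forward localhost:local_port to the node
echo "ssh -o ServerAliveInterval=300 -N -L ${ipnport}:${ipnip}:${ipnport} ${user}@ares.cyfronet.pl"
```

Running this command on your local machine (with the real values from the job's log file) makes Jupyter reachable at `localhost:<port>`.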
...
Create the following file:
```bash
#!/bin/bash
#SBATCH --partition plgrid
#SBATCH --nodes 1
#SBATCH --ntasks-per-node 6
#SBATCH --time 0:30:00
#SBATCH --job-name jupyter-notebook-tunnel
#SBATCH --output jupyter-log-%J.txt

## get tunneling info
XDG_RUNTIME_DIR=""
ipnport=$(shuf -i8000-9999 -n1)
ipnip=$(hostname -i)
user=$USER

## print tunneling instructions to jupyter-log-{jobid}.txt
echo -e "
Copy/Paste this in your local terminal to ssh tunnel with remote
-----------------------------------------------------------------
ssh -o ServerAliveInterval=300 -N -L $ipnport:$ipnip:$ipnport ${user}@ares.cyfronet.pl
-----------------------------------------------------------------

Then open a browser on your local machine to the following address
------------------------------------------------------------------
localhost:$ipnport (prefix w/ https:// if using password)
------------------------------------------------------------------
"

module load jupyterlab/3.1.6-gcccore-11.2.0 scipy-bundle/2021.10-intel-2021b

## start an ipcluster instance and launch jupyter server
jupyter-notebook --no-browser --port=$ipnport --ip=$ipnip
```
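Two lines in the script do the network bookkeeping: `shuf -i8000-9999 -n1` picks one random port from the 8000-9999 range (reducing collisions when several users run notebooks on the same node), and `hostname -i` reports the compute node's internal IP. The port-picking idiom can be tried on its own:

```bash
# shuf -i LO-HI -n1 prints one random integer from the inclusive range LO..HI;
# the job script uses this value as the Jupyter port.
ipnport=$(shuf -i8000-9999 -n1)
echo "$ipnport"
```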
...
Info: If you want to start JupyterLab instead of the classic notebook, change the last line of the script above so that it launches `jupyter-lab` instead of `jupyter-notebook` (keeping the same `--no-browser --port=$ipnport --ip=$ipnip` options).
Info: To use GPUs in your Jupyter job, add the appropriate GPU request to the `#SBATCH` job requirements, for example a `--gres=gpu` line together with a GPU partition.
Save it as `python-notebook.slurm`.
...
- PD - PENDING: Job is awaiting resource allocation.
- R - RUNNING: Job currently has an allocation and is running.
- CF - CONFIGURING: Job has been allocated resources, but they are not yet ready for use (e.g. the nodes are still booting). On Ares the CF state can last up to 8 minutes when the nodes have been in power-save mode.
- CG - COMPLETING: Job is in the process of completing; some processes on some nodes may still be active.

Once the job is running, cat the Jupyter log file in your directory:
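A quick way to locate and print the newest log file (a sketch: the `jupyter-log-*.txt` pattern matches the `--output jupyter-log-%J.txt` setting in the job script, where `%J` expands to the job ID):

```bash
# Find the most recently modified Jupyter log file, if any, and print it.
latest_log=$(ls -t jupyter-log-*.txt 2>/dev/null | head -n1)
if [ -n "$latest_log" ]; then
    cat "$latest_log"
else
    echo "no jupyter-log-*.txt found yet - is the job running?"
fi
```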
...
```bash
ssh -o ServerAliveInterval=300 -N -L 8511:172.20.68.193:8511 plgusername@ares.cyfronet.pl
```
Open in browser: `localhost:8511`
...
All information from Jupyter will be stored in this log file.
If you wish to end your job, use the `scancel <JOBID>` command, where JOBID is the job ID of your tunnel job; you can look it up with the `hpc-jobs` or `squeue -u $USER` commands.
```bash
scancel <JOBID>
```
To check submitted and running jobs, use the `hpc-jobs` or `squeue -u $USER` commands.
...