Glenn

Glenn is a computer cluster at C3SE.

The main page for information on the Glenn cluster is
http://www.c3se.chalmers.se/index.php/Glenn

The nodes use AMD Opteron 6220 processors:
http://en.wikipedia.org/wiki/List_of_AMD_Opteron_microprocessors#Opteron_4200-series_.22Valencia.22_.2832_nm.29

They probably look like this:
http://www.supermicro.nl/Aplus/system/1U/1042/AS-1042G-TF.cfm

AMD Opteron 6200 series compiler options quick reference guide:
http://www.c3se.chalmers.se/common/CompilerOptQuickRef-62004200.pdf

Glenn no. 5

The nuclear theory group has a private node to which we have exclusive access. Our node is named "glenn-quad1.int.private". Note that, logically, quad1 = 4+1 = 5, i.e. our node is Glenn no. 5:
http://www.systembolaget.se/89147

This is a monster node (4 sockets, 32 cores, 512 GB RAM). Its current workload can be monitored here (from within the Chalmers network):
http://www1.c3se.chalmers.se/ganglia/?m=load_one&r=hour&s=descending&c=Glenn&h=glenn-quad1.int.private&sh=1&hc=4&z=small

Running jobs, sample batch script

Login via ssh to the login node glenn.c3se.chalmers.se. Glenn uses the Slurm resource manager and scheduler. The recommended way of launching jobs on Glenn is via sbatch.

Some useful batch commands are listed below:

  • To check our queue:
    squeue -p subatom
  • To queue an interactive job, which starts a terminal on the full node. Note the time option; in this particular case the maximum time is set to 5 minutes:
    srun -p subatom -A C3SE999-12-1 -N 1 -n 32 -t 00:05:00 --pty bash -i
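Once the interactive shell starts, a quick sanity check confirms what was allocated. hostname and nproc are standard tools; the SLURM_JOB_ID variable is set by Slurm inside an allocation and is empty otherwise:

```shell
hostname                                   # on our node: glenn-quad1.int.private
nproc                                      # number of cores visible to the shell
echo "job: ${SLURM_JOB_ID:-not in a job}"  # job id, set by Slurm inside the job
```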

Further Slurm commands are listed at http://www.c3se.chalmers.se/index.php/Glenn

The following is a sample batch script to perform a run on our node (Antoine run script by Daniel). Note in particular the project number (-A flag) and our partition (-p flag).

#!/bin/bash
#SBATCH -A C3SE999-12-1    # SNIC project number!
#SBATCH -p subatom         # our subatom node (default partition is glenn)
#SBATCH -N 1               # number of nodes to request
#SBATCH -n 32              # total number of processes; 32 per node on our node (16 on standard Glenn nodes)
#SBATCH -o myoutput-%N.out # %N is replaced by the name of the 1st allocated node
#SBATCH -t 00:10:00        # walltime limit
#SBATCH --mail-type=END    # send mail when the job ends (BEGIN, END, FAIL and ALL are possible options)
#SBATCH --mail-user=daniel.saaf@chalmers.se

module load pgi # load the PGI compiler module

pdcp antoine_new_4GB_x86_64.exe $TMPDIR # copy files to node local disk
pdcp ~/ncsm/TBME/fort.9* $TMPDIR
mkdir $TMPDIR/cdm
pdcp ~/ncsm/cdm/* $TMPDIR/cdm
echo 'Tempdir'
echo $TMPDIR
ls -l $TMPDIR

ln -sf  $TMPDIR/cdm cdm
ln -sf  $TMPDIR/fort.90 fort.90
ln -sf  $TMPDIR/fort.91 fort.91
ln -sf  $TMPDIR/fort.92 fort.92
ln -sf  $TMPDIR/fort.93 fort.93
./antoine_new_4GB_x86_64.exe < He6_plus_nmax6 
ls -l
#cp $TMPDIR/fort.2* $SLURM_SUBMIT_DIR # copy output
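The staging pattern in the script above (copy inputs to node-local $TMPDIR, then symlink them back into the working directory so the program's I/O hits the fast local disk) can be sketched in miniature. The paths below are stand-ins: mktemp takes the place of the $TMPDIR that Slurm provides on the node, and plain cp replaces pdcp since only one node is involved:

```shell
# Miniature sketch of the node-local scratch pattern (stand-in paths, no Slurm).
scratch=$(mktemp -d)   # stands in for the $TMPDIR set by Slurm on the node
work=$(mktemp -d)      # stands in for the submit directory
cd "$work"

printf 'stand-in input\n' > input.dat
cp input.dat "$scratch/fort.90"      # stage the input onto "local disk"
ln -sf "$scratch/fort.90" fort.90    # the program then opens fort.90 via the link
cat fort.90                          # I/O goes to the scratch copy
```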

See further sample scripts at
http://www.c3se.chalmers.se/index.php/Glenn

Note in particular that the generic Physics project, currently SNIC001-11-309, also works on Glenn. With this project you can use -p glenn and submit MPI jobs utilizing many nodes.

Compilers and libraries

Available compilers are GCC, Intel (icc) and PGI. These are loaded using modules.
Recommended compilation flags for the Bulldozer architecture are given in: http://www.c3se.chalmers.se/common/CompilerOptQuickRef-62004200.pdf

  • The GSL library is available.
  • The AMD Core Math Library (ACML) is available. Routines, accessible via both Fortran 77 and C interfaces, include BLAS, LAPACK, FFT and RNG.
    • Beware that GSL also ships its own BLAS implementation (gslcblas), which is much slower than the ACML one.
  • See http://www.c3se.chalmers.se/index.php/Software_Glenn
The GSL library and CBLAS linked with ACML are available via modules; list them with:
module avail | grep "gcc" 
module avail | grep "acml" 

The module with the "_mp" suffix is preferred since it uses multiple threads.

They can then be linked at compile time (with GSL and ACML):
gcc $(gsl-config --libs-without-cblas) -lacmlcblas -lacml_mp main.o -o my_program
or, without GSL:
gcc -L/c3se/apps/Glenn/gsl/1.15-gcc462/lib -lacmlcblas -lacml_mp main.o -o my_program
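Since gslcblas and acmlcblas both provide CBLAS symbols, it is worth verifying which shared libraries the finished binary actually resolved. On Linux, ldd shows this; my_program is the stand-in name from the commands above, so the runnable line below uses a system binary that certainly exists instead:

```shell
# For the build above one would run:
#   ldd ./my_program | grep -i acml   # confirm ACML (not gslcblas) was picked up
# Demonstrated here on a system binary:
ldd "$(command -v ls)"
```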
Some more comments and suggestions (in Swedish) are available here.

Account administration

  • To list project members:
    sacctmgr list associations format=user,account%30s,partition
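The format string above prints one row per user/account association. The sample rows below are hypothetical (real output comes from the cluster, and sacctmgr typically prints account names in lowercase, which is why the filter lower-cases the field); an awk one-liner then lists only the members of one project:

```shell
# Hypothetical sample of the sacctmgr output; the awk filter keeps only
# members of the C3SE999-12-1 project.
cat > associations.txt <<'EOF'
User                          Account  Partition
daniel                   c3se999-12-1    subatom
anna                     c3se999-12-1    subatom
bob                    snic001-11-309      glenn
EOF
awk 'tolower($2) == "c3se999-12-1" { print $1 }' associations.txt
```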