Single-node sbatch scripts

Most MX software can run on a single node via an sbatch script, described in detail on the NSC Triolith pages or the LUNARC Aurora pages. MX software suitable for sbatch scripts includes:

  • XDSME
  • autoPROC
  • autoSHARP
  • BUSTER
  • pipedream and rhofit
  • ARCIMBOLDO_LITE
  • xia2 with the pipeline=dials or pipeline=3dii option
  • phaser via ccp4i/ccp4i2 or via Phenix "save parameter file"
  • phenix.mr_rosetta via Phenix "save parameter file"
  • phenix.rosetta_refine via Phenix "save parameter file"
  • XDS (can also run multi-node)
  • XDSAPP (can also run multi-node)

An HPC-MX user may run the Phenix GUI and the ccp4i2 GUI on the login node, and may also run serial compute jobs there, such as mtz-file editing, reindexing and search-model generation. Parallel computing, however, should be submitted to the compute nodes by:

  • Phenix "save parameter file" followed by submitting an sbatch script (MRage and rosetta_refine) - see the sketch after this list
  • Phenix "submit queue job", available from some wizards (AutoBuild and mr_rosetta)
  • the ccp4i2 "run on server" option, which we are working to make Slurm compatible
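
A minimal sketch of the first route, assuming a parameter file saved from the Phenix GUI under the hypothetical name params.eff (the module name and run time are also assumptions to adapt to your job):

#!/bin/sh
#SBATCH -t 4:00:00
#SBATCH -N 1 --exclusive
#SBATCH -A snic2018-3-xxx
#SBATCH --mail-type=ALL
#SBATCH --mail-user=name.surname@lu.se
module load Phenix
# params.eff is the parameter file saved from the Phenix GUI (hypothetical name)
phenix.mr_rosetta params.eff > mr_rosetta.log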

Running XDS and its derivatives on a single node

When writing a single-node sbatch script for XDS/XDSAPP/XDSGUI/autoPROC, we still need to satisfy

MAXIMUM_NUMBER_OF_JOBS x MAXIMUM_NUMBER_OF_PROCESSORS = number of nodes x cores per node = total number of cores

where the number of nodes is 1 when using a single node, and cores per node is 16 on Triolith and 20 on Aurora.
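
For example, the following combinations all satisfy the rule on a single node (a worked example; the 4 x 5 and 4 x 4 settings are the ones used in the examples below):

# Aurora: 1 node x 20 cores per node = 20 cores
#   MAXIMUM_NUMBER_OF_JOBS=4  MAXIMUM_NUMBER_OF_PROCESSORS=5
#   MAXIMUM_NUMBER_OF_JOBS=2  MAXIMUM_NUMBER_OF_PROCESSORS=10
# Triolith: 1 node x 16 cores per node = 16 cores
#   MAXIMUM_NUMBER_OF_JOBS=4  MAXIMUM_NUMBER_OF_PROCESSORS=4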

XDS and its derivatives set MAXIMUM_NUMBER_OF_JOBS and MAXIMUM_NUMBER_OF_PROCESSORS through related keywords and options:

  1. XDS and XDSGUI use MAXIMUM_NUMBER_OF_JOBS and MAXIMUM_NUMBER_OF_PROCESSORS in XDS.INP
  2. autoPROC uses autoPROC_XdsKeyword_MAXIMUM_NUMBER_OF_JOBS and autoPROC_XdsKeyword_MAXIMUM_NUMBER_OF_PROCESSORS in the sbatch script
  3. The XDSAPP GUI uses the "No. of jobs" and "No. of cpus" fields
  4. XDSAPP in command-line mode uses -j and -c in the sbatch script
  5. XDSME can only use a single node, since XDSME varies these parameters for each XDS step (INIT, COLSPOT, IDXREF, INTEGRATE, etc.), which cannot be matched by the Slurm allocation at the beginning of the XDSME sbatch script

Example 1: single-node XDSAPP at LUNARC Aurora (20 cores) - see benchmarks and XDSAPP command-line options

A) Open a terminal window
B) Edit xdsapp.script as below:
#!/bin/sh
#SBATCH -t 0:30:00
#SBATCH --nodes=1 --exclusive
#SBATCH -A snic2018-3-xxx
#SBATCH --mail-type=ALL
#SBATCH --mail-user=name.surname@lu.se
module load XDSAPP
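# -j 4 jobs x -c 5 cpus per job = 20 cores, i.e. one full Aurora node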
xdsapp --cmd \
--dir /lunarc/nobackup/users/mochma/test_suite_NSC/eiger/empty/presto/xdsapp_test \
-j 4 \
-c 5 \
-i /lunarc/nobackup/users/mochma/test_suite_NSC/eiger/empty/2015_11_10/insu6_1_data_000001.h5
C) Submit the job with:
sbatch xdsapp.script

Example 2: Run single-node XDS making your own XDS.INP from scratch at LUNARC Aurora

A) module load generate_XDS.INP
    generate_XDS.INP insu6_1_master.h5
B) Edit XDS.INP, planning for a single-node run at LUNARC Aurora (20 cores), by adding
  MAXIMUM_NUMBER_OF_JOBS=4
  MAXIMUM_NUMBER_OF_PROCESSORS=5
  LIB=/sw/pkg/presto/software/Neggia/1.0.1-goolf-PReSTO-1.7.20/lib/dectris-neggia.so
C) Create and edit xds.script as:
#!/bin/sh
#SBATCH -t 0:15:00
#SBATCH --nodes=1 --exclusive
#SBATCH -A snic2018-3-xxx
#SBATCH --mail-type=ALL
#SBATCH --mail-user=name.surname@lu.se
module load XDS
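# xds_par reads MAXIMUM_NUMBER_OF_JOBS and MAXIMUM_NUMBER_OF_PROCESSORS from XDS.INP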
xds_par
D) Submit the job with:
sbatch xds.script

Example 3a: autoPROC with Eiger data at NSC Triolith - see wiki

#!/bin/sh
#SBATCH -t 1:00:00
#SBATCH --nodes=1 --exclusive
#SBATCH -A snic2018-3-xxx
#SBATCH --mail-type=ALL
#SBATCH --mail-user=name.surname@lu.se
module load autoPROC
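# 4 jobs x 4 processors = 16 cores, i.e. one full Triolith node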
process \
-h5 /proj/xray/users/x_marmo/test_suite_NSC/eiger/empty/2015_11_10/insu6_1_master.h5 \
autoPROC_XdsKeyword_LIB=/proj/xray/presto/software/Neggia/1.0.1-goolf-PReSTO-1.7.20/lib/dectris-neggia.so \
autoPROC_XdsKeyword_MAXIMUM_NUMBER_OF_JOBS=4 \
autoPROC_XdsKeyword_MAXIMUM_NUMBER_OF_PROCESSORS=4 \
-d pro1 > pro1.log 

Example 3b: autoPROC with Eiger data at LUNARC Aurora - see wiki

#!/bin/sh
#SBATCH -t 0:30:00
#SBATCH --nodes=1 --exclusive
#SBATCH -A snic2018-3-xxx
#SBATCH --mail-type=ALL
#SBATCH --mail-user=name.surname@lu.se
module load autoPROC
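# 4 jobs x 5 processors = 20 cores, i.e. one full Aurora node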
process \
-h5 /lunarc/nobackup/users/mochma/test_suite_NSC/eiger/empty/2015_11_10/insu6_1_master.h5 \
autoPROC_XdsKeyword_LIB=/sw/pkg/presto/software/Neggia/1.0.1-goolf-PReSTO-1.7.20/lib/dectris-neggia.so \
autoPROC_XdsKeyword_MAXIMUM_NUMBER_OF_JOBS=4 \
autoPROC_XdsKeyword_MAXIMUM_NUMBER_OF_PROCESSORS=5 \
-d pro1 > pro1.log 

Example 3c: autoPROC with cbf files at LUNARC Aurora - see wiki

#!/bin/sh
#SBATCH -t 0:40:00
#SBATCH --nodes=1 --exclusive
#SBATCH -A snic2018-3-xxx
#SBATCH --mail-type=ALL
#SBATCH --mail-user=name.surname@lu.se
module load autoPROC
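# -Id: dataset identifier, image directory, image template, first and last image number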
process \
-Id id1,/lunarc/nobackup/users/name/test/pk,test_pk_1_#####.cbf,1,3600 \
-noANO -B \
autoPROC_XdsKeyword_MAXIMUM_NUMBER_OF_JOBS=4 \
autoPROC_XdsKeyword_MAXIMUM_NUMBER_OF_PROCESSORS=5 \
-d pro3 > pro3.log 

Single-node sbatch script examples

Here we share a number of single-node sbatch script examples for MX software running on NSC Triolith and LUNARC Aurora.

Example 4a: xia2 pipeline=dials using CCP4 software with Eiger data

#!/bin/sh
#SBATCH -t 1:00:00
#SBATCH -N 1 --exclusive
#SBATCH -A snic2018-3-XXX
#SBATCH --mail-type=ALL
#SBATCH --mail-user=name.surname@lu.se
module load CCP4
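# serial mode: one xia2 job (njob=1) using all cores detected on the node (nproc=auto)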
xia2 pipeline=dials failover=true \
image=/proj/xray/users/x_marmo/test_suite_NSC/eiger/empty/2015_11_10/insu6_1_master.h5 \
multiprocessing.mode=serial \
multiprocessing.njob=1 \
multiprocessing.nproc=auto \
trust_beam_centre=False read_all_image_headers=False

Example 4b: xia2 pipeline=dials using standalone DIALS with Eiger data

#!/bin/sh
#SBATCH -t 1:00:00
#SBATCH -N 1 --exclusive
#SBATCH -A snic2018-3-XXX
#SBATCH --mail-type=ALL
#SBATCH --mail-user=name.surname@lu.se
module load DIALS
xia2 pipeline=dials failover=true \
image=/proj/xray/users/x_marmo/test_suite_NSC/eiger/empty/2015_11_10/insu6_1_master.h5 \
multiprocessing.mode=serial \
multiprocessing.njob=1 \
multiprocessing.nproc=auto \
trust_beam_centre=False read_all_image_headers=False

Example 4c: xia2 pipeline=3dii (XDS) with Eiger data using CCP4

#!/bin/sh
#SBATCH -t 1:00:00
#SBATCH -N 1 --exclusive
#SBATCH -A snic2018-3-XXX
#SBATCH --mail-type=ALL
#SBATCH --mail-user=name.surname@lu.se
module load CCP4
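# nproc=16 matches the 16 cores of a Triolith node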
xia2 pipeline=3dii failover=true \
image=/proj/xray/users/x_marmo/test_suite_NSC/eiger/empty/2015_11_10/insu6_1_master.h5 \
multiprocessing.mode=serial \
multiprocessing.njob=1 \
multiprocessing.nproc=16 \
trust_beam_centre=False read_all_image_headers=False

Example 5: XDSME with Eiger data

#!/bin/sh
#SBATCH -t 00:30:00
#SBATCH -N 1 --exclusive
#SBATCH -A snic2018-3-xxx
#SBATCH --mail-type=ALL
#SBATCH --mail-user=name.surname@lu.se
module load XDSME/0.6.0-PReSTO
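# XDSME sets MAXIMUM_NUMBER_OF_JOBS and MAXIMUM_NUMBER_OF_PROCESSORS itself for each XDS step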
xdsme /lunarc/nobackup/users/mochma/test_suite_NSC/eiger/empty/2015_11_10/insu6_1_??????.h5

Example 6: autoSHARP - see wiki or run a test job

#!/bin/sh
#SBATCH -t 8:00:00
#SBATCH -N 1 --exclusive
#SBATCH -A snic2018-3-XXX
#SBATCH --mail-type=ALL
#SBATCH --mail-user=name.surname@lu.se
module load SHARP
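# three-wavelength MAD phasing (peak, inflection, high-energy remote) on Zn with up to 20 sites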
run_autoSHARP.sh \
-seq sequence.pir -ha "Zn" \
-nsit 20 \
-wvl 1.28334 peak -mtz pk.mtz \
-wvl 1.27838 infl -mtz ip.mtz \
-wvl 0.91841 hrem -mtz rm.mtz \
-id MADjob1 | tee MADjob1.lis

Example 7: sbatch script for ARCIMBOLDO_LITE

ARCIMBOLDO_LITE (1) performs high-resolution ab initio phasing; PDB entry 4eto was used for benchmarking (2):

1. ARCIMBOLDO_LITE: single-workstation implementation and use.
Sammito M, Millán C, Frieske D, Rodríguez-Freire E, Borges R, Usón I
Acta Crystallogr. D Biol. Crystallogr. 2015 Sep;71(Pt 9):1921-30
2. Macromolecular ab initio phasing enforcing secondary and tertiary structure.
Millán C, Sammito M, Usón I
IUCrJ 2015 Jan;2(Pt 1):95-105

The 4eto benchmarking data files can be downloaded, and the job executed via sbatch arcimboldo.script:

#!/bin/sh
#SBATCH -t 6:00:00
#SBATCH -N 1 --exclusive
#SBATCH -A snic2018-3-XXX
#SBATCH --mail-type=ALL
#SBATCH --mail-user=name.surname@lu.se
module load CCP4
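# ARCIMBOLDO_LITE reads all job settings from the .bor configuration file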

job=4eto
ARCIMBOLDO_LITE ${job}.bor > ${job}.log

To run your own job, please modify the arcimboldo.script above and the 4eto.bor file according to the ARCIMBOLDO manual:

# 4eto.bor - start 
[CONNECTION]
distribute_computing: multiprocessing
working_directory: /proj/xray/users/x_marmo/run_test_suite_NSC/arcimboldo

[GENERAL]
working_directory = /proj/xray/users/x_marmo/run_test_suite_NSC/arcimboldo
mtz_path = ./4eto_2.mtz
hkl_path = ./4eto_4.hkl

[ARCIMBOLDO]
shelxe_line = -m30 -v0 -a3 -t20 -q -s0.55
helix_length = 14
sigf_label = SIGF
molecular_weight = 22000
name_job = 4eto_def
f_label = F
fragment_to_search = 2 
number_of_component = 1
identity = 0.2

[LOCAL]
# Third party software paths at NSC Triolith
path_local_phaser: /proj/xray/presto/software/CCP4/7.0.044-goolf-PReSTO-1.7.20-SHELX-ARP-7.6/ccp4-7.0/bin/phaser
path_local_shelxe: /proj/xray/presto/software/CCP4/7.0.044-goolf-PReSTO-1.7.20-SHELX-ARP-7.6/ccp4-7.0/bin/shelxe
# 4eto.bor - end

Example 8: BUSTER using sbatch scripts - see wiki and tutorial

#!/bin/sh
#SBATCH -t 1:00:00
#SBATCH -N 1 
#SBATCH -n 8
#SBATCH -A snic2018-3-XXX
#SBATCH --mail-type=ALL
#SBATCH --mail-user=name.surname@lu.se
module load BUSTER
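# -nthreads 8 matches the 8 cores requested via #SBATCH -n 8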
refine -p model.pdb \
-m data.mtz \
-l chiral.dat \
-l grade-AME.cif \
-Gelly tweak4.gelly \
-nthreads 8 \
-autoncs \
-TLS \
AutomaticFormfactorCorrection=yes \
StopOnGellySanityCheckError=no \
-d run1 > run1.log

Example 9: pipedream using sbatch scripts - see tutorial

#!/bin/sh
#SBATCH -t 2:00:00
#SBATCH -N 1
#SBATCH -n 8
#SBATCH -A snic2018-3-XXX
#SBATCH --mail-type=ALL
#SBATCH --mail-user=name.surname@lu.se
module load BUSTER
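# -nthreads 8 matches the 8 cores requested via #SBATCH -n 8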
pipedream -imagedir /proj/xray/users/x_marmo/proteinA/x003/col \
-l grade-LIG.cif \
-xyzin apo_structure.pdb \
-nofreeref \
-nthreads 8 \
-d pipe_LIG_PReSTO > pipe_LIG_PReSTO.log

Example 10: rhofit using sbatch scripts - see tutorial

#!/bin/sh
#SBATCH -t 1:00:00
#SBATCH -N 1
#SBATCH -n 8
#SBATCH -A snic2018-3-XXX
#SBATCH --mail-type=ALL
#SBATCH --mail-user=name.surname@lu.se
module load BUSTER
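# fit the LIG ligand into density from the pipedream refinement output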
rhofit -l grade-LIG.cif    \
-p pipe_LIG_PReSTO/refine/refine.pdb \
-m pipe_LIG_PReSTO/refine/refine.mtz \
-d rhofit_1 > rhofit1.log
