Running on batch systems

Every batch system has its own configuration. The setup for some common systems is described below; use it as a starting point for your own submission scripts and adapt it to your specific needs.

Snowmass Connect (snowmass.io)

An OSG-provided cluster originally set up in the context of Snowmass and still available for tutorials and small-scale work.

A minimal HTCondor submission script, myjob.submit, for the OSG is shown below:

Universe = Vanilla
Executable   = run.sh
Requirements = ( ( HAS_SINGULARITY == TRUE ) && ( HAS_CVMFS_unpacked_cern_ch ) )
+SingularityImage = "/cvmfs/unpacked.cern.ch/registry.hub.docker.com/infnpd/mucoll-ilc-framework:1.6-centos8"
Error   = output.err.$(Cluster)-$(Process)
Output  = output.out.$(Cluster)-$(Process)
Log     = output.log.$(Cluster)
should_transfer_files = YES
WhenToTransferOutput = ON_EXIT
request_cpus = 1
request_memory = 5 GB
+ProjectName="snowmass21.energy"
Queue 1
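The last line controls how many jobs are submitted. As a hedged sketch of standard HTCondor behaviour (not specific to the Snowmass documentation), Queue N submits N copies of the job, and the $(Process) macro, which runs from 0 to N-1, keeps their output files distinct and can be passed to the executable as an argument:

```
# Pass the process number to run.sh as $1 (optional)
Arguments = $(Process)
# Submit 10 jobs; $(Process) runs from 0 to 9, so the
# Error/Output names above already give each job its own files
Queue 10
```

Inside run.sh, the value is then available as the first positional parameter ($1), which is useful for selecting a different input file per job.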

When the HTCondor script above is submitted, you request a remote worker node with 1 CPU core and 5 GB of memory to run the run.sh executable. In this case, run.sh is a shell script containing the list of commands that execute your workload on the worker node. For example:

#!/bin/bash
echo "Sourcing Setup"
source /opt/ilcsoft/init_ilcsoft.sh

echo "Retrieving needed input files not readily available"
wget  https://stash.osgconnect.net/collab/project/snowmass21/data/ilc/<something>

echo "Running reconstruction"
Marlin -h #change this to your specific command

You can find more information on the Snowmass21 Connect wiki.