
G.3 Script Environment

The prologue and epilogue scripts can be very simple. On most systems, the script must declare the execution shell using the #!<SHELL> syntax (for example, "#!/bin/sh"). In addition, the script may need to process the context-sensitive arguments that Torque passes to it, which are described below.

In this topic:

G.3.1 Prologue Environment
G.3.2 Epilogue Environment
G.3.3 Environment Variables
G.3.4 Standard Input

G.3.1 Prologue Environment

The following arguments are passed to the presetup.prologue, prologue, prologue.user, and prologue.parallel scripts:

Argument Description
argv[1] Job ID.
argv[2] Job execution user name.
argv[3] Job execution group name.
argv[4] Job name (Torque 1.2.0p4 and higher only).
argv[5] List of requested resource limits (Torque 1.2.0p4 and higher only).
argv[6] Job execution queue (Torque 1.2.0p4 and higher only).
argv[7] Job account (Torque 1.2.0p4 and higher only).
argv[8] Job script location.
argv[9] Comma-separated list of each host in the job. For example, if a job is using 10 cores on each of roshar, nalthis, elantris, and scadrial, this argument will have the value: roshar,nalthis,elantris,scadrial.
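
As an illustration only, a minimal prologue script that consumes these arguments might look like the following sketch. The log file path is an arbitrary example, not a Torque default.

#!/bin/sh
# Minimal prologue sketch; the positional parameters correspond to the argv[] table above.
jobid="$1"       # argv[1] - job ID
user="$2"        # argv[2] - job execution user name
hostlist="$9"    # argv[9] - comma-separated list of hosts in the job

# Example action: record the job start (the log path is an arbitrary example).
echo "$(date) prologue: job $jobid for $user on $hostlist" >> /var/log/torque_prologue.log

# Exit 0 so Torque treats the prologue as successful and starts the job.
exit 0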

G.3.2 Epilogue Environment

Torque supplies the following arguments to the epilogue, epilogue.user, epilogue.precancel, and epilogue.parallel scripts:

Argument Description
argv[1] Job ID.
argv[2] Job execution user name.
argv[3] Job execution group name.
argv[4] Job name.
argv[5] Session ID.
argv[6] List of requested resource limits.
argv[7] List of resources used by the job.
argv[8] Job execution queue.
argv[9] Job account.
argv[10] Job exit code.
argv[11] Job script location.
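
A corresponding epilogue sketch that records the job's exit code and resource usage follows; again, the log file path is an arbitrary example.

#!/bin/sh
# Minimal epilogue sketch; the positional parameters correspond to the argv[] table above.
jobid="$1"            # argv[1] - job ID
user="$2"             # argv[2] - job execution user name
resources_used="$7"   # argv[7] - list of resources used by the job
exit_code="${10}"     # argv[10] - job exit code (braces are required for $10 and higher)

echo "$(date) epilogue: job $jobid (user $user) exited $exit_code; used: $resources_used" >> /var/log/torque_epilogue.log

exit 0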

The epilogue.precancel script is run after a job cancel request is received by the MOM and before any signals are sent to job processes. If this script exists, it is run whether the canceled job was active or idle.

The cancel job command (qdel) will take as long to return as the epilogue.precancel script takes to run. For example, if the script runs for 5 minutes, it takes 5 minutes for qdel to return.
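
One way to keep qdel responsive, sketched below, is to have epilogue.precancel hand any longer cleanup work to a detached background process and return immediately. The cleanup helper path is hypothetical.

#!/bin/sh
# epilogue.precancel sketch: return quickly so qdel is not delayed.
jobid="$1"    # argv[1] - job ID

# Hand any long-running cleanup to a detached background process.
# /usr/local/sbin/cancel_cleanup is a hypothetical site-specific helper.
nohup /usr/local/sbin/cancel_cleanup "$jobid" >/dev/null 2>&1 &

exit 0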

G.3.3 Environment Variables

For all scripts, the environment passed to the script is empty, except for the variables that Torque defines when the job is submitted through qsub or msub -E, as described in the following sections.

G.3.3.A qsub

When submitting a job through qsub, Torque defines the following variables.

Variable Description
$PBS_MSHOST Mother superior's hostname.
$PBS_RESOURCE_NODES The -l nodes request made for the job, if any.
$PBS_O_WORKDIR The job's working directory.
$PBS_NODENUM The index, within the job, of the node where this prologue or epilogue is executing.
$PBS_NUM_NODES The number of nodes requested for the job (1 if no -l nodes request was made).
$PBS_NP The number of execution slots used for the job. For example, -l nodes=2:ppn=4 results in $PBS_NP being defined as 8.
$PBS_NUM_PPN The ppn request, if one was made. If more than one was made, this is the first one; for example, -l nodes=2:ppn=3+4:ppn=2 sets this variable to 3.
$PBS_NODEFILE The path to the job's nodefile.
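
These variables can be referenced directly from the scripts. As a sketch, a prologue might verify that the nodefile is readable and log the job's size; the log file path is an arbitrary example.

#!/bin/sh
# Sketch using the qsub-defined environment variables from the table above.
jobid="$1"    # argv[1] - job ID

# $PBS_NODEFILE, $PBS_NODENUM, $PBS_NUM_NODES, and $PBS_NP are set by Torque as described above.
if [ ! -r "$PBS_NODEFILE" ]; then
    echo "$(date) prologue: job $jobid nodefile $PBS_NODEFILE not readable" >> /var/log/torque_prologue.log
    exit 1    # a non-zero exit tells Torque the prologue failed
fi

echo "$(date) prologue: job $jobid node $PBS_NODENUM of $PBS_NUM_NODES, $PBS_NP slots" >> /var/log/torque_prologue.log
exit 0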

G.3.3.B msub -E

If you submit the job using msub -E, Moab also makes its own environment variables available to these scripts. See msub in the Moab Workload Manager Administrator Guide for the list of these variables and more information.

G.3.4 Standard Input

Standard input for both the prologue and epilogue scripts is connected to a system-dependent file. Currently, for all systems this is /dev/null.

Standard output and standard error are connected to output and error files associated with the job, except for the epilogue scripts of an interactive job and for prologue.parallel, epilogue.precancel, and epilogue.parallel.

For an interactive job, the pseudo-terminal connection is released after the job completes, so the standard input and error point to /dev/null.

For prologue.parallel and epilogue.parallel, the user will need to redirect stdout and stderr manually.
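
For example, the following prologue.parallel sketch redirects its own output to a per-job log file; the path under /tmp is an arbitrary example.

#!/bin/sh
# prologue.parallel sketch: stdout and stderr are not captured in the job's output files,
# so redirect them explicitly to a per-job log.
jobid="$1"    # argv[1] - job ID
exec >> "/tmp/prologue.parallel.$jobid.log" 2>&1

echo "$(date) prologue.parallel starting for job $jobid on $(hostname)"
exit 0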

