If you intend to use Torque Resource Manager 6.0.2 with Moab Workload Manager, you must run Moab version 8.0 or later. However, some Torque 6.0 functionality requires Moab 9.0 or later.
This topic contains instructions on how to install, configure, and start Torque Resource Manager (Torque).
For Cray systems, Adaptive Computing recommends that you install Moab and Torque Servers (head nodes) on commodity hardware (not on Cray compute/service/login nodes).
However, you must install the Torque pbs_mom daemon and Torque client commands on Cray login and "mom" service nodes, because pbs_mom must run on a Cray service node within the Cray system so that it has access to the Cray ALPS subsystem.
See Installation Notes for Moab and Torque for Cray in the Moab Workload Manager Administrator Guide for instructions on installing Moab and Torque on a non-Cray server.
Torque requires certain ports to be open for essential communication.
For more information on how to configure the ports that Torque uses for communication, see Configuring Ports in the Torque Resource Manager Administrator Guide.
If you have a firewall enabled, do the following:
[root]# iptables-save > /tmp/iptables.mod
[root]# vi /tmp/iptables.mod
# Add the following line immediately *before* the line matching
# "-A INPUT -j REJECT --reject-with icmp-host-prohibited"
-A INPUT -p tcp --dport 15001 -j ACCEPT
[root]# iptables-restore < /tmp/iptables.mod
[root]# service iptables save
[root]# iptables-save > /tmp/iptables.mod
[root]# vi /tmp/iptables.mod
# Add the following line immediately *before* the line matching
# "-A INPUT -j REJECT --reject-with icmp-host-prohibited"
-A INPUT -p tcp --dport 15002:15003 -j ACCEPT
[root]# iptables-restore < /tmp/iptables.mod
[root]# service iptables save
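If your system uses firewalld rather than the iptables service (an assumption — adjust for your distribution), the same Torque ports can be opened with firewall-cmd; the port numbers match the iptables rules above:

```shell
# Open the Torque communication ports in firewalld (assumes firewalld is running)
[root]# firewall-cmd --permanent --add-port=15001/tcp
[root]# firewall-cmd --permanent --add-port=15002-15003/tcp
# Apply the permanent rules to the running firewall
[root]# firewall-cmd --reload
```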
On the Torque Server Host, confirm your host (with the correct IP address) is in your /etc/hosts file. To verify that the hostname resolves correctly, make sure that hostname and hostname -f report the correct name for the host.
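The check described above can be run as follows (a generic sketch; the expected output depends on your site's hostnames):

```shell
# Verify that the short and fully qualified hostnames resolve as expected
hostname                      # short hostname
hostname -f                   # fully qualified domain name
getent hosts "$(hostname)"    # confirm /etc/hosts (or DNS) maps the name to the correct IP
```

If `getent` returns the wrong address (for example, a loopback entry), correct the host's line in /etc/hosts before continuing.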
You must complete the prerequisite tasks earlier in this topic before installing the Torque Server. See 3.6.1 Prerequisites.
On the Torque Server Host, do the following:
[root]# yum install moab-torque-server
[root]# . /etc/profile.d/torque.sh
Example:
[root]# vi /var/spool/torque/server_priv/nodes
node01 np=16
node02 np=16
...
[root]# service pbs_server start
[root]# service trqauthd start
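Once pbs_server and trqauthd are running, qmgr can be used to review or adjust the server configuration (a sketch; the attribute shown is an example, not a required setting):

```shell
# Print the active server configuration to confirm pbs_server is responding
[root]# qmgr -c 'print server'
# Example attribute change: enable scheduling (value shown is only an example)
[root]# qmgr -c 'set server scheduling = true'
```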
In most installations, you will install a Torque MOM on each of your compute nodes.
Do the following:
[root]# scp RPMs/moab-torque-common-*.rpm <torque-mom-host>:
[root]# scp RPMs/moab-torque-mom-*.rpm <torque-mom-host>:
[root]# scp RPMs/moab-torque-client-*.rpm <torque-mom-host>:
[root]# ssh root@<torque-mom-host>
[root]# yum install moab-torque-common-*.rpm moab-torque-mom-*.rpm moab-torque-client-*.rpm
[root]# echo <torque_server_hostname> > /var/spool/torque/server_name
[root]# vi /var/spool/torque/mom_priv/config
$pbsserver <torque_server_hostname> # hostname running pbs server
$logevent 225 # bitmap of which events to log
[root]# service pbs_mom start
[root]# service trqauthd start
3.6.4 Configure Data Management
When a batch job completes, its stdout and stderr files are generated and placed in the spool directory on the master Torque MOM Host for the job, not on the submit host. You can configure the Torque batch environment to copy the stdout and stderr files back to the submit host. See Configuring Data Management in the Torque Resource Manager Administrator Guide for more information.
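For example, if the submit host and the compute nodes share a filesystem, the MOM can copy output locally with a $usecp mapping in mom_priv/config (the hostname pattern and paths below are placeholders, not values from this installation):

```shell
[root]# vi /var/spool/torque/mom_priv/config
# Map submit-host paths to the shared local mount so output files are copied
# locally rather than over the network (hostname and paths are examples)
$usecp *.example.com:/home /home
```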