
@kyleabeauchamp
Last active August 29, 2015 14:06
MPI Tests
-bash-4.1$ cat mpitest.o2365443
3,2,1
3,2,1
3,2,1
3,2,1
gpu-1-4.local
gpu-1-4.local
gpu-1-4.local
gpu-1-5.local
grep: /var/spool/torque/aux/2365443.mskcc-fe1.localgpu: No such file or directory
Set compute mode to DEFAULT for GPU 0000:84:00.0.
All done.
Set compute mode to DEFAULT for GPU 0000:83:00.0.
All done.
Set compute mode to DEFAULT for GPU 0000:04:00.0.
All done.
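The `grep: /var/spool/torque/aux/2365443.mskcc-fe1.localgpu: No such file or directory` line in the log above suggests something tried to read the Torque GPU assignment file (`$PBS_GPUFILE`) on a host where it was never written. A minimal sketch of guarding that read, assuming a POSIX shell; the fallback path is purely illustrative:

```shell
# Hedged sketch: check that the Torque GPU file exists before reading it.
# $PBS_GPUFILE is set by Torque inside a GPU job; the fallback path below
# is only there so the snippet runs outside a job for illustration.
gpufile=${PBS_GPUFILE:-/var/spool/torque/aux/does-not-exist.gpu}
if [ -f "$gpufile" ]; then
    # List the GPU assignments for this host
    grep "$(hostname)" "$gpufile"
else
    echo "no GPU file at $gpufile" >&2
fi
```

Guarding the read keeps a prologue or epilogue script from aborting (or spamming stderr) on nodes where Torque did not create the file.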
#!/bin/sh
# walltime : maximum wall clock time (hh:mm:ss)
#PBS -l walltime=00:02:00
#
# join stdout and stderr
#PBS -j oe
#
# spool output immediately
#PBS -k oe
#
# specify GPU queue
#PBS -q gpu
#
# nodes: number of nodes
# ppn: number of processes per node
# gpus: number of gpus per node
# GPUs are in 'exclusive' mode by default; the 'shared' keyword sets them to shared mode.
#PBS -l nodes=4:ppn=1:gpus=1:exclusive
#
# export all my environment variables to the job
# (disabled; remove the extra '#' from the line below to enable)
##PBS -V
#
# job name (default = name of script file)
#PBS -N mpitest
#
# specify email for notifications
#PBS -M redacted@gmail.com
#
# mail settings (one or more characters)
# n: do not send mail
# a: send mail if job is aborted
# b: send mail when job begins execution
# e: send mail when job terminates
#PBS -m n
#
# filename for standard output (default = <job_name>.o<job_id>)
# at end of job, it is in directory from which qsub was executed
# remove the extra '#' from the line below if you want to name your own file
##PBS -o myoutput
# Change to working directory used for job submission
cd $PBS_O_WORKDIR
# run the MPI test commands
# note: $CUDA_VISIBLE_DEVICES below expands in the submitting shell before
# mpirun launches, so every rank echoes the same value
mpirun -rmk pbs echo $CUDA_VISIBLE_DEVICES
mpirun -rmk pbs hostname
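The identical `3,2,1` lines in the log are consistent with `$CUDA_VISIBLE_DEVICES` being expanded once by the submitting shell, before `mpirun` ever launches the ranks; single-quoting a subshell command (e.g. `mpirun -rmk pbs sh -c 'echo $CUDA_VISIBLE_DEVICES'`) would defer expansion to each rank's own environment. A local sketch of the same expansion-timing difference, using an ordinary illustrative variable (`DEMO_VAR`) so it runs without `mpirun`:

```shell
# DEMO_VAR is illustrative; no MPI or PBS required.
export DEMO_VAR=parent
# Double quotes: $DEMO_VAR expands in the current shell before sh starts,
# so the child just echoes the literal word "parent".
early=$(sh -c "echo $DEMO_VAR")
# Single quotes: the literal string '$DEMO_VAR' reaches the child shell,
# which expands its own (overridden) value.
late=$(DEMO_VAR=child sh -c 'echo $DEMO_VAR')
echo "$early $late"   # prints "parent child"
```

The same quoting rule applies under `mpirun`: without quoting, every rank prints the launching node's value rather than its own.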