NAMD/MIC & CUDA on Stampede

From: Jeffery Klauda (jbklauda_at_umd.edu)
Date: Wed Jul 16 2014 - 13:28:27 CDT

I would like to run NAMD optimized for MIC and CUDA on Stampede. I have
been able to use the provided scripts in the following directory:

/home1/00288/tg455591/NAMD_scripts
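
These are submitted through SLURM in the usual way; a minimal example (the exact invocation here is my illustration, not taken from the scripts themselves):

   sbatch /home1/00288/tg455591/NAMD_scripts/runbatch_latest_mic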

Submitted that way, runbatch_latest_mic runs fine. The issue is that I
would like to use my own queue submission script and have not been able
to get one to work; I have tried scripts in both csh and bash with no
luck. I am confused as to why runbatch_latest_mic works but the script
below produces the following error:

TACC: Starting up job 3694819
TACC: Setting up parallel environment for MVAPICH2+mpispawn.
TACC: Starting parallel tasks...
Charmrun: Bad initnode data length. Aborting
Charmrun> IBVERBS version of charmrun
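
Since charmrun's ++mpiexec mode relies on the ++remote-shell wrapper to launch the namd2 processes that report their node information back to charmrun, one sanity check I can think of (my own sketch, not from the NAMD documentation) would be to confirm the wrapper can launch a trivial command at all:

   # from an interactive session on a compute node, run the wrapper
   # by hand; presumably it should start one hostname process per task
   /home1/00288/tg455591/NAMD_scripts/mpiexec hostname

If that fails or misbehaves, the "Bad initnode data length" error would be a symptom rather than the root problem.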

**Submission Script**
#!/bin/csh
#SBATCH -J emre16 # Job name
#SBATCH -o emre.o%j # output and error file name (%j expands to jobID)
#SBATCH -N 1 # total number of nodes
#SBATCH -n 16 # total number of mpi tasks requested
#SBATCH -p normal-mic # queue (partition) -- normal, development, etc.
#SBATCH -t 02:00:00 # run time (hh:mm:ss)

module load fftw3

# Set variables per Schulten Lab /home1/00288/tg455591/NAMD_scripts
set BINDIR = /work/00288/tg455591/NAMD_build.latest/NAMD_2.9_Linux-x86_64-ibverbs-smp-Stampede-MIC

set SCRIPTDIR = /home1/00288/tg455591/NAMD_scripts

@ REALNUMPROCS = 1 * 15   # total worker threads: 1 node x 15 per node
@ NUMTASKS = 1            # one launch task (not used below)

echo Running on host `hostname`
echo Time is `date`
echo Directory is `pwd`

# Execution of the script: communication thread pinned to core 0
# (+commap), 15 worker PEs pinned to cores 1-15 (+pemap), device
# index 0 (+devices)
$BINDIR/charmrun +p$REALNUMPROCS ++ppn 15 ++mpiexec ++remote-shell \
    $SCRIPTDIR/mpiexec $BINDIR/namd2 +commap 0 +pemap 1-15 +devices 0 \
    dyn.inp >& dyn2.out
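
As an aside, since this is a single-node job (-N 1), charmrun's ++local option, which launches everything on the local host and bypasses the mpiexec/remote-shell path, might serve as a cross-check; a minimal sketch, assuming the ibverbs charmrun on Stampede supports it:

   $BINDIR/charmrun ++local +p$REALNUMPROCS ++ppn 15 $BINDIR/namd2 \
       +commap 0 +pemap 1-15 +devices 0 dyn.inp >& dyn2.out

If that works, the problem presumably lies in the mpiexec launch path rather than in the binaries themselves.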

Jeff

-- 
*****************************************************
Jeffery B. Klauda
Associate Professor
Department of Chemical and Biomolecular Engineering
2113 Building 90
University of Maryland
College Park, MD 20742
Phone: (301) 405-1320
Fax: (301) 314-9126
Office: 1227A Chemical & Nuclear Engineering Bldg.
Web: terpconnect.umd.edu/~jbklauda
*****************************************************
