+p option on Windows 7

From: Kukol, Andreas (a.kukol_at_herts.ac.uk)
Date: Wed Jul 30 2014 - 09:56:28 CDT

Hello,

I obtained the precompiled NAMD binary for Windows and am trying to run it on 4 CPU cores:

>c:\NAMD_2.9_Win64-MPI\namd2.exe +p4 ionized.namd > ionized.log

However, I get the following error message:
------------- Processor 0 Exiting: Called CmiAbort ------------
Reason: FATAL ERROR: Unknown command-line option +p4

Including a space ('+p 4') does not work either. namd2 runs fine without +p, but it is a bit slow.

This is the relevant information from the 'notes.txt' file included in the download:
Windows, Mac OS X (Intel), and Linux-x86_64-multicore released binaries
are based on "multicore" builds of Charm++ that can run multiple threads.
These multicore builds lack a network layer, so they can only be used on
a single machine. For best performance use one thread per processor
with the +p option:

  namd2 +p<procs> <configfile>

If I run namd2 without any input, I get the following info:

Charm++> Running on MPI version: 2.0
Charm++> level of thread support used: MPI_THREAD_SINGLE (desired: MPI_THREAD_SINGLE)
Charm++> Running on non-SMP mode
Converse/Charm++ Commit ID: v6.4.0-beta1-0-g5776d21
[0] isomalloc.c> Disabling isomalloc because mmap() does not work
CharmLB> Load balancer assumes all CPUs are same.
Charm++> Running on 1 unique compute nodes (4-way SMP).
Charm++> cpu topology info is gathered in 0.000 seconds.
Info: NAMD 2.9 for Win64-MPI
Info:
Info: Please visit http://www.ks.uiuc.edu/Research/namd/
Info: for updates, documentation, and support information.
Info:
Info: Please cite Phillips et al., J. Comp. Chem. 26:1781-1802 (2005)
Info: in all publications reporting results obtained with NAMD.
Info:
Info: Based on Charm++/Converse 60400 for mpi-win64
Info: Built Mon Apr 30 14:30:56 CDT 2012 by jcphill on cs-dexterity
Info: Running on 1 processors, 1 nodes, 1 physical nodes.
Info: CPU topology information available.
Info: Charm++/Converse parallel runtime startup completed at 0.016 s
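
From the startup output above, this binary appears to be an MPI build ("Win64-MPI") rather than one of the multicore builds that the notes.txt passage describes, so perhaps it has to be launched through the MPI runtime instead of with +p? A minimal sketch of what I mean, assuming an MPI runtime such as MS-MPI provides mpiexec on the PATH (this mpiexec invocation is my guess, not something from notes.txt):

>mpiexec -n 4 c:\NAMD_2.9_Win64-MPI\namd2.exe ionized.namd > ionized.log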

Any help would be appreciated.

Many thanks
Andreas
