From: Axel Kohlmeyer (akohlmey_at_gmail.com)
Date: Sun Oct 04 2009 - 05:23:58 CDT
On Sat, 2009-10-03 at 14:18 -0700, Jorgen Simonsen wrote:
> Hi all,
hi jorgen,
[...]
> cd $PBS_O_WORKDIR
> ../../../Programs/NAMD/charmrun ++local ../../../Programs/NAMD/namd2
> +p$NPROCS min.conf > data.log
>
>
> it starts up 16 threads on one processor which is of course a waste.
of course it is not ideal, but it is exactly what _you_ asked
charmrun to do. computers don't read minds; they do what they
are told.
> If I remove the ++local and add ++verbose
> Charmrun> charmrun started...
> Charmrun> using /home/user/.nodelist as nodesfile
> Charmrun> remote shell (localhost:0) started
> Charmrun> remote shell (localhost:1) started
> Charmrun> remote shell (localhost:2) started
> Charmrun> remote shell (localhost:3) started
> Charmrun> node programs all started
> connect to address 127.0.0.1: Connection refused
> connect to address 127.0.0.1: Connection refused
> trying normal rsh (/usr/bin/rsh)
> connect to address 127.0.0.1: Connection refused
[...]
> What is wrong and how can I fix this. Thanks in advance
there is nothing wrong with that either; this is exactly how
charmrun behaves with that command line. the main problem here
is that you haven't read the documentation on how to run namd
in parallel, for example what the notes.txt file says about
running on linux/unix machines.
you have to construct a proper nodelist file for the nodes
that are assigned to you by the batch system. your system also
doesn't allow you to rsh into a node without a password, so
you have to fix that as well (either by telling charmrun to
use ssh and setting up passwordless ssh access, or by having
the rsh setup fixed). i am certain that if you search the
mailing list archives or the namd wiki, you'll find more
detailed examples and explanations.
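as a rough sketch (the namd paths, the nodelist location, and the
use of ssh are assumptions; adapt them to your own cluster), the
relevant part of a pbs job script could look like this:

```shell
#!/bin/sh
# sketch of a PBS job script fragment; NAMD paths and $NPROCS
# are assumptions taken from the original post.
cd $PBS_O_WORKDIR

# build a charmrun nodelist from the node file PBS provides:
# $PBS_NODEFILE lists one hostname per allocated core, and
# charmrun expects a "group main" header followed by "host" lines.
echo "group main" > $HOME/.nodelist
awk '{ print "host " $1 }' $PBS_NODEFILE >> $HOME/.nodelist

# launch namd through ssh instead of rsh (requires passwordless
# ssh between the compute nodes); commented out here because it
# only makes sense inside an actual batch job:
# ../../../Programs/NAMD/charmrun ++remote-shell ssh \
#     ++nodelist $HOME/.nodelist +p$NPROCS \
#     ../../../Programs/NAMD/namd2 min.conf > data.log
```

the same nodelist also works without the ++nodelist flag if it is
saved as ~/.nodelist, which is where charmrun looks by default (as
the ++verbose output above shows).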
cheers,
axel.
>
> Best
>
>
> Jorgen
>
>
--
Dr. Axel Kohlmeyer  akohlmey_at_gmail.com
Institute for Computational Molecular Science
College of Science and Technology
Temple University, Philadelphia PA, USA.
This archive was generated by hypermail 2.1.6 : Wed Feb 29 2012 - 15:53:20 CST