RE: problem running namd

From: Michel Espinoza-Fonseca (mef_at_ddt.biochem.umn.edu)
Date: Mon Apr 04 2005 - 16:48:48 CDT

Thank you very much for your comments. In the end I'll try to compile
namd for this specific machine (we don't have a lot of them anyway ;)).
I'll also try to link the libraries statically. After all, it is not a
waste of time, because it is a nice program :)
Cheers,
Michel

-----Original Message-----
From: Jerry Ebalunode [mailto:jebalunode_at_uh.edu]
Sent: Monday, April 04, 2005 4:45 PM
To: Brian Bennion
Cc: Michel Espinoza-Fonseca; namd-l_at_ks.uiuc.edu
Subject: Re: namd-l: problem running namd

Hi Brian,
Dynamically linked programs are smaller, and hence help save local disk
space. Can you imagine the extra disk space it would take to link all the
binaries in an OS like Linux statically?
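
As a toy illustration of the size difference (a sketch, not namd itself;
it assumes gcc and the static C library are installed):

  # build the same trivial program dynamically and statically,
  # then compare the binary sizes
  echo 'int main(void) { return 0; }' > tiny.c
  gcc tiny.c -o tiny_dynamic
  gcc -static tiny.c -o tiny_static
  ls -l tiny_dynamic tiny_static   # the static binary is much larger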

> Good comment. Why then work with dynamic libraries in the first place??
>
> Brian
>
> On Mon, 4 Apr 2005, Jerry Ebalunode wrote:
> > One quick way to avoid problems with namd or other programs built and
> > supported by different versions of the Intel compiler and its runtime
> > libraries is to simply link the Intel libraries statically at build
> > time. This way your namd2 would not have problems running on machines
> > with different versions of the runtime libraries.
> >
> > To do this, use the flag "-static-libcxa" at link time.
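> >
> > As a sketch of what that looks like on a link line (the compiler
> > driver and object files below are placeholders, not the actual NAMD
> > build command):
> >
> >   # link namd2 with the Intel C++ runtime (libcxa) bound statically,
> >   # so the binary no longer needs the host's copy of that runtime
> >   icpc -o namd2 main.o simulation.o -lm -static-libcxa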
> >
> > On Monday 04 April 2005 01:12 pm, Brian Bennion wrote:
> > > The problem is most likely a difference between the MPI libraries
> > > that namd2 was compiled with and the ones used at runtime on the
> > > larger machine.
> > >
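> > > A quick way to confirm this kind of mismatch (standard Linux
> > > tooling; the binary name is taken from the error output quoted
> > > below) is to ask the dynamic linker what it actually resolves:
> > >
> > >   # -r performs the relocations and reports unresolved symbols,
> > >   # e.g. the _Locksyslock seen in the error message
> > >   ldd -r ./namd2-altix
> > >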
> > > I have this issue all the time with icc8.0 and icc8.1 compilations.
> > >
> > > If you can't compile against the current libs on the larger machine,
> > > then try exporting your LD_LIBRARY_PATH in your batch script.
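> > >
> > > Something along these lines (the library path and file names are
> > > assumptions; point them at wherever the matching runtime libraries
> > > actually live on that machine):
> > >
> > >   #!/bin/bash
> > >   # prepend the directory holding the libraries namd2 was linked
> > >   # against, so the runtime loader picks up the same versions
> > >   export LD_LIBRARY_PATH=/opt/intel/compiler80/lib:$LD_LIBRARY_PATH
> > >   ./namd2-altix sim.conf > sim.log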
> > >
> > > regards
> > > Brian
> > >
> > > On Mon, 4 Apr 2005, Michel Espinoza-Fonseca wrote:
> > > > Hi all,
> > > >
> > > > I compiled NAMD on an Altix machine. The executable ran and
> > > > performed very well. I was really excited about running namd on a
> > > > bigger machine, but when I sent the job I got the following message:
> > > >
> > > > Charm++> MPI timer is synchronized!
> > > > Info: NAMD 2.5 for Linux-ia64
> > > > Info:
> > > > Info: Please visit http://www.ks.uiuc.edu/Research/namd/
> > > > Info: and send feedback or bug reports to namd_at_ks.uiuc.edu
> > > > Info:
> > > > Info: Please cite Kale et al., J. Comp. Phys. 151:283-312 (1999)
> > > > Info: in all publications reporting results obtained with NAMD.
> > > > Info:
> > > > Info: Based on Charm++/Converse 0143163 for mpi-linux-ia64-ifort-mpt-icc
> > > > Info: Built Thu Mar 10 18:04:44 CST 2005 by espinoza on balt
> > > > Info: Sending usage information to NAMD developers via UDP. Sent data is:
> > > > Info: 1 NAMD 2.5 Linux-ia64 18 altix.msi.umn.edu espinoza
> > > > ./namd2-altix: relocation error: ./namd2-altix: undefined symbol: _Locksyslock
> > > > MPI: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
> > > > MPI: aborting job
> > > >
> > > > And of course the job just crashed. I also executed it without an
> > > > input file, in order to test the binary, and I got the same problem.
> > > > Do you have any suggestions on how I can solve the problem? Do I
> > > > need to recompile namd?
> > > >
> > > > Thank you very much!
> > > > Peace
> > > > Michel
> > >
> > > ************************************************
> > > Brian Bennion, Ph.D.
> > > Bioscience Directorate
> > > Lawrence Livermore National Laboratory
> > > P.O. Box 808, L-448 bennion1_at_llnl.gov
> > > 7000 East Avenue phone: (925) 422-5722
> > > Livermore, CA 94550 fax: (925) 424-6605
> > > ************************************************
>
> ************************************************
> Brian Bennion, Ph.D.
> Bioscience Directorate
> Lawrence Livermore National Laboratory
> P.O. Box 808, L-448 bennion1_at_llnl.gov
> 7000 East Avenue phone: (925) 422-5722
> Livermore, CA 94550 fax: (925) 424-6605
> ************************************************
