From: Gengbin Zheng (gzheng_at_ks.uiuc.edu)
Date: Thu Dec 30 2004 - 22:50:45 CST
Hi,
Most of these problems with templates and gcc 3.4 have been fixed in
the latest version. For a build on Opteron, you have to use the
"opteron" option, because it sets up several configurations needed for
compiling on Opteron; without it, NAMD won't run even if you can
compile the binary.
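For example, from the top-level charm directory the build command would
look something like this (the exact option order and any extra compiler
flags, such as -O here, are just illustrative; adjust them to your
setup):

./build charm++ mpi-linux gm opteron -O

This is the kind of build that produces a directory such as
mpi-linux-gm-opteron, which is what NAMD's CHARMARCH then has to point
at.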
I am wondering if there are any additional error messages before the
ones you pasted below, i.e. the error messages given by the actual
gcc compiler. Could it be that the object files are not being compiled
as 64-bit? Use the command 'file foo.o' to verify. If that is the
reason, use the -m64 compiler flag to generate 64-bit object files.
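For example (the object file name here is just an example, and the
exact wording of the output depends on your version of the file
utility):

$ file convcore.o
convcore.o: ELF 64-bit LSB relocatable, AMD x86-64, version 1 (SYSV), not stripped

If it reports "ELF 32-bit" instead, the -m64 flag is missing from the
compile line.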
The new version of NAMD has a Linux-amd64-MPI.arch file for compiling
with Opteron and MPI.
The content of this file is:
NAMD_ARCH = Linux-amd64
CHARMARCH = mpi-linux-opteron
CXX = g++ -DSOCKLEN_T=socklen_t -DNO_STRSTREAM_H
CXXOPTS = -O3 -m64 -fexpensive-optimizations -ffast-math
CC = gcc
COPTS = -O3 -m64 -fexpensive-optimizations -ffast-math
You need to change CHARMARCH to match the charm version you compiled.
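For example, if your charm tree was built for MPI with Myrinet and the
opteron option, the build directory (and therefore CHARMARCH) would be
something like the one that appears in the paths of your error output
below:

CHARMARCH = mpi-linux-gm-opteron

Use whatever directory name actually exists under your charm tree.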
Gengbin
Damon Smith wrote:
>Hi all,
>
>Sorry for this really long email, but as the subject suggests, I had a
>go at compiling namd 2.5 on opteron with mpich 1.2.6 and myrinet, using
>gcc 3.4. I haven't found good docs on this; I've only seen docs for
>net-linux with amd64 and mpi, but nothing for myrinet.
>
>A lot of the C++ code in charm++ and namd won't compile because of
>gcc 3.4's changes to the way templates work, but it's a simple fix; I
>can post a patch if anyone is interested. The details on the gcc
>problem are here:
>http://www.dis.com/gnu/gcc/Name-lookup.html
>(it won't break the code for old compilers either)
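>A minimal sketch of the kind of fix involved (the class and member
>names below are just illustrative, not taken from the actual charm++
>or namd sources): with gcc 3.4, a call to a member inherited from a
>dependent base class has to be qualified, e.g. with this->, so that it
>is looked up at instantiation time.
>
> template <class T> struct Base {
>   void put(T x) {}
> };
>
> template <class T> struct Derived : public Base<T> {
>   void store(T x) {
>     // put(x);      // accepted by gcc 3.3, rejected by gcc 3.4
>     this->put(x);   // found at instantiation time; old compilers
>                     // accept this form too
>   }
> };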
>
>
>Once I got past that, and after a lot more mucking around, it's hard
>to work out what to do. Charm++ won't compile with:
>mpi-linux opteron gm
>
>It gets the following error:
>Fatal Error by charmc in
>directory /home/san02/damon/tmp/charm-5.8/mpi-linux-gm-opteron/tmp
> Command mpicc -rdynamic -o ../lib_so/libconv-core.so -shared
>-L/home/san02/damon/tmp/charm-5.8/mpi-linux-gm-opteron/lib_so
>convcore.o conv-conds.o queueing.o msgmgr.o cpm.o cpthreads.o futures.o
>cldb.o topology.o random.o debug-conv.o generate.o edgelist.o conv-ccs.o
>ccs-builtins.o traceCore.o traceCoreCommon.o converseProjections.o
>machineProjections.o quiescence.o isomalloc.o global-nop.o returned
>error code 1
>charmc exiting...
>make[2]: *** [../lib/libconv-core.a] Error 1
>
>but it will compile with just
>mpi-linux gm
>
>and then namd doesn't appear to have an option for amd64 or opteron with
>mpi and myrinet. So I modified the i686-gm-mpi to use -march=opteron,
>and it compiles ok, but gets a linker error:
>
>collect2: ld returned 1 exit status
>Fatal Error by charmc in
>directory /home/san02/damon/tmp/NAMD_2.5_Source/Linux-i686-MPI
> Command mpiCC -rdynamic -O3 -march=opteron -ffast-math -static
>-L/home/san02/damon/tmp/fftw-3.0.1/lib
>-L/home/san02/damon/fftw-linux/lib
>-L/home/san02/damon/tmp/plugins//LINUXAMD64
>-L/home/san02/damon/tmp/plugins//LINUXAMD64/molfile
>-L/home/san02/damon/tmp/plugins/compile/LINUXAMD64
>-L/home/san02/damon/tmp/plugins/compile/LINUXAMD64/molfile
>-I/home/san02/damon/tmp/charm-5.8/mpi-linux-gm/include -o namd2
>-L/home/san02/damon/tmp/charm-5.8/mpi-linux-gm/bin/../lib
>-I/home/san02/damon/tmp/charm-5.8/mpi-linux-gm/bin/../include /home/san02/damon/tmp/charm-5.8/mpi-linux-gm/bin/../lib/libldb-rand.o obj/buildinfo.o obj/common.o obj/dcdlib.o obj/erf.o obj/main.o
>[..snip..]
>obj/pub3dfft.o obj/vmdsock.o obj/parm.o obj/imd.o moduleinit.o
>-lmoduleNeighborLB
>-lmodulecommlib /home/san02/damon/tmp/charm-5.8/mpi-linux-gm/bin/../lib/libmemory-default.o /home/san02/damon/tmp/charm-5.8/mpi-linux-gm/bin/../lib/libthreads-default.o -lck -lconv-cplus-y -lconv-core -lconv-util -lm -lckqt -lpmpich -lgm -lpthread -ldl -lz -lsrfftw -lsfftw -lmolfile_plugin -lm -lmoduleNeighborLB -lmodulecommlib returned error code 1
>charmc exiting...
>
>So that's where I give up. Those linker errors tell me nothing useful.
>Has anyone, anywhere got namd to compile with gcc, on opteron, using mpi
>and myrinet? If so how??
>
>Maybe I should try getting the latest version of charm from CVS, as it
>appears to have changes to the amd64 code as of 6 days ago.
>
>Thanks,
>
>Damon
>
>