Re: I: Running Charm++ over MPI does not work

From: Chris Harrison (char_at_ks.uiuc.edu)
Date: Thu Jan 22 2009 - 10:53:50 CST

Andrea,

I suggest emailing the charm++ developers at ppl_at_cs.uiuc.edu since your
problem appears to be exclusively a charm++ issue.

C.

-- 
Chris Harrison, Ph.D.
Theoretical and Computational Biophysics Group
NIH Resource for Macromolecular Modeling and Bioinformatics
Beckman Institute for Advanced Science and Technology
University of Illinois, 405 N. Mathews Ave., Urbana, IL 61801
char_at_ks.uiuc.edu                            Voice: 217-244-1733
http://www.ks.uiuc.edu/~char               Fax: 217-244-6078
On Thu, Jan 22, 2009 at 8:23 AM, <andy.mastellone_at_alice.it> wrote:
>  Hi Mahesh,
>
> perhaps you mean the precompiled version that one downloads from the Charm++
> web site? In that case, I would need an mpi-linux-ia64 build for the Intel
> compilers (ifort and ecc), which I do not see on their site...
>
> Thank you for your reply...
>
> Andrea
>
> -----Original Message-----
> From: mahesh kulharia [mailto:kulharia_at_googlemail.com]
> Sent: Thu 22/01/2009 15.14
> To: andy.mastellone_at_alice.it
> Subject: Re: namd-l: Running Charm++ over MPI does not work
>
> Run the prebuilt binaries. I had the same problems, and the binaries are
> running properly for me.
>
> regards
>
> 2009/1/22 <andy.mastellone_at_alice.it>
>
> >  Hi,
> >
> > Before installing NAMD I ran into some problems running the Charm++ test
> > routines with MPI. On both a single host (ia32 with an OpenMPI
> > implementation) and a multicore host (ia64 with an NEC MPI environment) I
> > successfully built charm++, including the MPI files, by issuing
> >
> > ./build LIBS mpi-linux-ia32(64) icc(ecc) ifort -O2 -DCMK_OPTIMIZE
> >
> > (I have the Intel compilers), but when I run make in
> >
> > tests/charm++/simplearrayhello
> >
> > it fails with
> >
> > andy charm-6.0/tests/charm++/simplearrayhello 114 % make
> > ./../../bin/charmc  -language charm++ -o hello hello.o
> > ./../../bin/../lib/libconv-cplus-y.a(machine.o): In function
> > `CmiNotifyIdle':
> > machine.c:(.text+0x202): undefined reference to `MPI_Recv'
> > machine.c:(.text+0x232): undefined reference to `MPI_Get_count'
> > ./../../bin/../lib/libconv-cplus-y.a(machine.o): In function
> > `CmiAsyncSendFn_':
> > machine.c:(.text+0xe82): undefined reference to `MPI_Isend'
> > ./../../bin/../lib/libconv-cplus-y.a(machine.o): In function
> > `CmiAsyncBroadcastAllFn':
> > machine.c:(.text+0x11c2): undefined reference to `MPI_Test'
> > ./../../bin/../lib/libconv-cplus-y.a(machine.o): In function
> > `CmiAsyncBroadcastFn':
> > machine.c:(.text+0x1622): undefined reference to `MPI_Test'
> > ./../../bin/../lib/libconv-cplus-y.a(machine.o): In function
> > `CmiReleaseSentMessages':
> > machine.c:(.text+0x2702): undefined reference to `MPI_Test'
> > ./../../bin/../lib/libconv-cplus-y.a(machine.o): In function `PumpMsgs':
> > machine.c:(.text+0x28f2): undefined reference to `MPI_Iprobe'
> > machine.c:(.text+0x2942): undefined reference to `MPI_Get_count'
> > machine.c:(.text+0x2992): undefined reference to `MPI_Recv'
> > ./../../bin/../lib/libconv-cplus-y.a(machine.o): In function
> > `CmiAsyncMsgSent':
> > machine.c:(.text+0x2e52): undefined reference to `MPI_Test'
> > ./../../bin/../lib/libconv-cplus-y.a(machine.o): In function
> > `CmiBarrierZero':
> > machine.c:(.text+0x3182): undefined reference to `MPI_Recv'
> > machine.c:(.text+0x3232): undefined reference to `MPI_Send'
> > ./../../bin/../lib/libconv-cplus-y.a(machine.o): In function
> > `CmiCpuTimer':
> > machine.c:(.text+0x32d2): undefined reference to `MPI_Wtime'
> > ./../../bin/../lib/libconv-cplus-y.a(machine.o): In function
> > `CmiWallTimer':
> > machine.c:(.text+0x3352): undefined reference to `MPI_Wtime'
> > ./../../bin/../lib/libconv-cplus-y.a(machine.o): In function `CmiTimer':
> > machine.c:(.text+0x33d2): undefined reference to `MPI_Wtime'
> > ./../../bin/../lib/libconv-cplus-y.a(machine.o): In function
> > `CmiBarrier':
> > machine.c:(.text+0x34a2): undefined reference to `MPI_Barrier'
> > ./../../bin/../lib/libconv-cplus-y.a(machine.o): In function
> > `CmiTimerIsSynchronized':
> > machine.c:(.text+0x3582): undefined reference to `MPI_Attr_get'
> > ./../../bin/../lib/libconv-cplus-y.a(machine.o): In function
> > `CmiTimerInit':
> > machine.c:(.text+0x3812): undefined reference to `MPI_Wtime'
> > machine.c:(.text+0x3852): undefined reference to `MPI_Allreduce'
> > machine.c:(.text+0x38c2): undefined reference to `MPI_Wtime'
> > ./../../bin/../lib/libconv-cplus-y.a(machine.o): In function
> > `ConverseInit':
> > machine.c:(.text+0x3962): undefined reference to `MPI_Init_thread'
> > machine.c:(.text+0x3992): undefined reference to `MPI_Comm_size'
> > machine.c:(.text+0x39c2): undefined reference to `MPI_Comm_rank'
> > machine.c:(.text+0x39e2): undefined reference to `MPI_Get_version'
> > ./../../bin/../lib/libconv-cplus-y.a(machine.o): In function
> > `ConverseExit':
> > machine.c:(.text+0x44e2): undefined reference to `MPI_Test'
> > machine.c:(.text+0x4532): undefined reference to `MPI_Barrier'
> > machine.c:(.text+0x4592): undefined reference to `MPI_Finalize'
> > machine.c:(.text+0x46a2): undefined reference to `MPI_Test'
> > ./../../bin/../lib/libconv-cplus-y.a(machine.o): In function `CmiAbort':
> > machine.c:(.text+0x47f2): undefined reference to `MPI_Abort'
> > Fatal Error by charmc in directory
> > /u1/maad559/charm-6.0/tests/charm++/simplearrayhello
> >    Command ecpc -rdynamic -o hello -L../../../bin/../lib
> > -I../../../bin/../include ../../../bin/../lib/libldb-rand.o hello.o
> > moduleinit4822.o ../../../bin/../lib/libmemory-default.o
> > ./../../bin/../lib/libthreads-default.o -lck -lconv-cplus-y -lconv-core
> > -lconv-util -lckqt -llammpio -llammpi++ -llamf77mpi -lmpi -llam -laio
> > -laio
> > -lutil -ldl -lm returned error code 1
> > charmc exiting...
> > make: *** [hello] Error 1
> >
> > By contrast, the UDP version (net-linux-....) passes all tests on both
> > machines.
> >
> > Any suggestion or help would be very welcome at this stage. Please don't
> > hesitate to ask me for further details if required. Thank you...
> >
>
>
>
> --
> Kulharia
> Department of youknowwhat
> University of youknowwhere
>
>
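A note for readers who hit the same errors: every undefined reference in the log above is to an `MPI_*` symbol, which typically means the final link step (run here with the plain `ecpc` compiler) did not receive a working set of MPI link flags for that machine; linking through the MPI implementation's wrapper compiler (e.g. `mpicc`/`mpiCC`) normally avoids this. A minimal, portable sketch of that distinction (the `link_cmd` string is a hypothetical abbreviation of the failing command above):

```shell
#!/bin/sh
# Hypothetical diagnostic: check whether a link command uses an MPI
# compiler wrapper (mpicc/mpiCC/mpicxx/...) or a plain compiler such as
# ecpc. With a plain compiler, every MPI library (-lmpi, -llam, ...)
# must be listed explicitly and actually found by the linker, or the
# link fails with "undefined reference to `MPI_...'" as shown above.
link_cmd="ecpc -rdynamic -o hello hello.o -lmpi -llam"

case "$link_cmd" in
  mpicc*|mpiCC*|mpicxx*|mpiicc*)
    echo "MPI wrapper in use; link flags are supplied automatically" ;;
  *)
    echo "plain compiler: MPI link flags must be complete and correct" ;;
esac
```

If the wrapper is available, rebuilding after pointing charmc at it (or verifying the wrapper itself links a trivial MPI program) is a quick way to separate a Charm++ configuration problem from a broken MPI installation.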

This archive was generated by hypermail 2.1.6 : Wed Feb 29 2012 - 05:21:40 CST