[mpich-discuss] valgrind is indicating problems in MPICH2 when using the MPE library
John Michael Alex Grime
jgrime at uchicago.edu
Fri Feb 15 14:49:46 CST 2013
Hello all,
If I compile an MPI program with the "-llmpe" and "-lmpe" options to enable MPI tracing, running the resulting binary through valgrind ("mpiexec -np X valgrind --track-origins=yes ./program ...") produces several instances of the following type of message:
==459== Conditional jump or move depends on uninitialised value(s)
==459== at 0x10001C949: MPE_Req_wait_test (in ./program)
==459== by 0x10001D4B0: MPI_Waitall (in ./program)
==459== by 0x100047FE1: AtomShare::Reset(std::map<int, std::map<int, int, std::less<int>, std::allocator<std::pair<int const, int> > >, std::less<int>, std::allocator<std::pair<int const, std::map<int, int, std::less<int>, std::allocator<std::pair<int const, int> > > > > >&, int) (in ./program)
==459== by 0x100051DD5: test(int, char**) (in ./program)
==459== by 0x100052FED: main (in ./program)
==459== Uninitialised value was created by a stack allocation
==459== at 0x10001D53D: MPI_Waitall (in ./program)
... whereas compiling and running the same code without the "-llmpe" and "-lmpe" options does not trigger these warnings. I've checked that the MPI_Request arrays passed to MPI_Waitall() are indeed allocated and filled with valid requests (from MPI_Isend() and MPI_Irecv()).
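To illustrate, the communication pattern is essentially the following (a minimal ring-exchange sketch for the sake of discussion, not the actual AtomShare code from my program) - every request slot is written by MPI_Irecv()/MPI_Isend() before MPI_Waitall() sees it, so none of the requests should be uninitialised:

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, nranks;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    int send_val = rank, recv_val = -1;
    int next = (rank + 1) % nranks;            /* simple ring exchange */
    int prev = (rank + nranks - 1) % nranks;

    MPI_Request reqs[2];                       /* both slots are set below */
    MPI_Irecv(&recv_val, 1, MPI_INT, prev, 0, MPI_COMM_WORLD, &reqs[0]);
    MPI_Isend(&send_val, 1, MPI_INT, next, 0, MPI_COMM_WORLD, &reqs[1]);

    MPI_Waitall(2, reqs, MPI_STATUSES_IGNORE); /* warning appears here, MPE only */

    printf("rank %d received %d\n", rank, recv_val);
    MPI_Finalize();
    return 0;
}

With the MPE libraries linked in, the valgrind warning points at MPE_Req_wait_test inside the MPI_Waitall wrapper rather than at anything in my own code, which is what makes me suspect the tracing layer rather than the request arrays themselves.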
Has anyone else noticed any problems with MPE using uninitialized stack variables?
For what it's worth, running mpiexec -info gives:
HYDRA build details:
Version: 1.4.1p1
Release Date: Thu Sep 1 13:53:02 CDT 2011
CC: /usr/bin/gcc-4.2 -I/opt/local/include -pipe -O2 -arch x86_64 -L/opt/local/lib -arch x86_64
CXX: /usr/bin/g++-4.2 -I/opt/local/include -pipe -O2 -arch x86_64 -L/opt/local/lib -arch x86_64
F77:
F90:
Configure options: '--prefix=/opt/local' '--with-thread-package=posix' '--enable-timer-type=mach_absolute_time' '--enable-cxx' '--mandir=/opt/local/share/man' '--docdir=/opt/local/share/doc/mpich2' '--htmldir=/opt/local/share/doc/mpich2' '--includedir=/opt/local/include/mpich2' '--disable-f77' '--disable-fc' '--with-mpe' '--with-device=ch3:nemesis' 'F90FLAGS=' 'F90=' '--with-pm=hydra' '--enable-shared' '--enable-cache' '--enable-smpcoll' '--enable-base-cache' '--enable-sharedlibs=osx-gcc' 'CC=/usr/bin/gcc-4.2' 'LDFLAGS=-L/opt/local/lib -arch x86_64 -Wl,-flat_namespace' 'CPPFLAGS=-I/opt/local/include -I/opt/local/var/macports/build/_Volumes_work_mports_dports_science_mpich2/mpich2/work/mpich2-1.4.1p1/src/mpl/include -I/opt/local/var/macports/build/_Volumes_work_mports_dports_science_mpich2/mpich2/work/mpich2-1.4.1p1/src/mpl/include -I/opt/local/var/macports/build/_Volumes_work_mports_dports_science_mpich2/mpich2/work/mpich2-1.4.1p1/src/openpa/src -I/opt/local/var/macports/build/_Volumes_work_mports_dports_science_mpich2/mpich2/work/mpich2-1.4.1p1/src/openpa/src -I/opt/local/var/macports/build/_Volumes_work_mports_dports_science_mpich2/mpich2/work/mpich2-1.4.1p1/src/mpid/ch3/include -I/opt/local/var/macports/build/_Volumes_work_mports_dports_science_mpich2/mpich2/work/mpich2-1.4.1p1/src/mpid/ch3/include -I/opt/local/var/macports/build/_Volumes_work_mports_dports_science_mpich2/mpich2/work/mpich2-1.4.1p1/src/mpid/common/datatype -I/opt/local/var/macports/build/_Volumes_work_mports_dports_science_mpich2/mpich2/work/mpich2-1.4.1p1/src/mpid/common/datatype -I/opt/local/var/macports/build/_Volumes_work_mports_dports_science_mpich2/mpich2/work/mpich2-1.4.1p1/src/mpid/common/locks -I/opt/local/var/macports/build/_Volumes_work_mports_dports_science_mpich2/mpich2/work/mpich2-1.4.1p1/src/mpid/common/locks -I/opt/local/var/macports/build/_Volumes_work_mports_dports_science_mpich2/mpich2/work/mpich2-1.4.1p1/src/mpid/ch3/channels/nemesis/include -I/opt/local/var/macports/build/_Volumes_work_mports_dports_science_mpich2/mpich2/work/mpich2-1.4.1p1/src/mpid/ch3/channels/nemesis/include -I/opt/local/var/macports/build/_Volumes_work_mports_dports_science_mpich2/mpich2/work/mpich2-1.4.1p1/src/mpid/ch3/channels/nemesis/nemesis/include -I/opt/local/var/macports/build/_Volumes_work_mports_dports_science_mpich2/mpich2/work/mpich2-1.4.1p1/src/mpid/ch3/channels/nemesis/nemesis/include -I/opt/local/var/macports/build/_Volumes_work_mports_dports_science_mpich2/mpich2/work/mpich2-1.4.1p1/src/mpid/ch3/channels/nemesis/nemesis/utils/monitor -I/opt/local/var/macports/build/_Volumes_work_mports_dports_science_mpich2/mpich2/work/mpich2-1.4.1p1/src/mpid/ch3/channels/nemesis/nemesis/utils/monitor -I/opt/local/var/macports/build/_Volumes_work_mports_dports_science_mpich2/mpich2/work/mpich2-1.4.1p1/src/util/wrappers -I/opt/local/var/macports/build/_Volumes_work_mports_dports_science_mpich2/mpich2/work/mpich2-1.4.1p1/src/util/wrappers' 'FFLAGS=-pipe -O2 -m64 ' 'CFLAGS=-pipe -O2 -arch x86_64 -O2' 'CXX=/usr/bin/g++-4.2' 'CXXFLAGS=-pipe -O2 -arch x86_64 -O2' '--disable-option-checking' 'LIBS=-lpthread '
Process Manager: pmi
Launchers available: ssh rsh fork slurm ll lsf sge manual persist
Topology libraries available: hwloc
Resource management kernels available: user slurm ll lsf sge pbs
Checkpointing libraries available:
Demux engines available: poll select
Cheers,
J.