[mpich-discuss] libmpi.so.12 error

Sophie Norman sophie.norman at gmail.com
Tue Sep 9 09:21:20 CDT 2014


Oops, there is a version of OpenMPI on the system that I didn't find before.

Sangmin, thank you. Here are the two different versions:

$ /usr/bin/mpiexec --version
mpiexec (OpenRTE) 1.4.5
Report bugs to http://www.open-mpi.org/community/help/


$ /home/rpimpi/mpich3-install/bin/mpiexec --version
HYDRA build details:
    Version:                                 3.1.2
    Release Date:                            Mon Jul 21 16:00:21 CDT 2014
    CC:                              gcc
    CXX:                             g++
    F77:                             gfortran
    F90:                             gfortran
    Configure options:                       '--disable-option-checking'
'--prefix=/home/rpimpi/mpich3-install' '--enable-shared'
'--cache-file=/dev/null'
'--srcdir=/home/pi/mpich3/mpich-3.1.2/src/pm/hydra' 'CC=gcc' 'CFLAGS= -O2'
'LDFLAGS= ' 'LIBS=-lrt -lpthread ' 'CPPFLAGS=
-I/home/pi/mpich3/src/mpl/include
-I/home/pi/mpich3/mpich-3.1.2/src/mpl/include
-I/home/pi/mpich3/mpich-3.1.2/src/openpa/src
-I/home/pi/mpich3/src/openpa/src -D_REENTRANT
-I/home/pi/mpich3/src/mpi/romio/include'
    Process Manager:                         pmi
    Launchers available:                     ssh rsh fork slurm ll lsf sge manual persist
    Topology libraries available:            hwloc
    Resource management kernels available:   user slurm ll lsf sge pbs cobalt
    Checkpointing libraries available:
    Demux engines available:                 poll select


Running with MPICH's mpiexec made everything work as it should, so thank you
all for your help!
Just one last question: how can I make the MPICH mpiexec run by default
instead of the OpenMPI one, without having to type
/home/rpimpi/mpich3-install/bin/mpiexec each time?
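(A guess on my part that I haven't tested: would prepending the MPICH bin
directory to PATH on each node do it? Something like:

```shell
# Untested idea (mine, not confirmed in this thread): prepend MPICH's bin
# directory so its mpiexec shadows /usr/bin/mpiexec.
# Add this line to ~/.bashrc on every node.
export PATH=/home/rpimpi/mpich3-install/bin:$PATH

# Check which mpiexec the shell now finds first; prints nothing if none found.
command -v mpiexec || true
```

Is that the right way, or is there a cleaner one?)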

Yours happily

Sophie

On 9 September 2014 15:01, Seo, Sangmin <sseo at anl.gov> wrote:

>  Did you install MPICH-3.1.2 to /home/rpimpi/mpich3-install/? If so and
> you did not specify --bindir, /usr/bin/mpiexec might be installed by a
> different MPI implementation. Can you try these?
>
>  /usr/bin/mpiexec --version
> /home/rpimpi/mpich3-install/bin/mpiexec --version
>
>  If they are different, can you also try to use
> /home/rpimpi/mpich3-install/bin/mpiexec to execute your application?
>
>  — Sangmin
>
>  On Sep 9, 2014, at 8:44 AM, Sophie Norman <sophie.norman at gmail.com>
> wrote:
>
>  Sure Sangmin, here you go
>
>  $ mpiexec -n 4 -machinefile /home/pi/mpi_testing/machinefile which mpiexec
> /usr/bin/mpiexec
> /usr/bin/mpiexec
> /usr/bin/mpiexec
> /usr/bin/mpiexec
>
>
>  Sophie
>
> On 9 September 2014 13:27, Seo, Sangmin <sseo at anl.gov> wrote:
>
>>  To verify the path to mpiexec on nodes, can you try the following?
>>
>>  mpiexec -n 4 -machinefile /home/pi/mpi_testing/machinefile which mpiexec
>>
>>
>>  — Sangmin
>>
>>
>>  On Sep 9, 2014, at 6:37 AM, Sophie Norman <sophie.norman at gmail.com>
>> wrote:
>>
>>  I've looked and there's definitely no OpenMPI installation; there is,
>> however, a Python mpi4py installation. Could that be causing the problem?
>>
>> On 9 September 2014 12:08, Kenneth Raffenetti <raffenet at mcs.anl.gov>
>> wrote:
>>
>>> That looks like an error message from OpenMPI's mpiexec. That is surely
>>> part of the problem here. Do you have an OpenMPI installation somewhere
>>> that may be conflicting with your MPICH? One solution would be to make sure
>>> that the path to MPICH's mpiexec is first in your environment.
>>>
>>> Ken
>>>
>>> On 09/09/2014 05:31 PM, Sophie Norman wrote:
>>>
>>>> Hi Ken
>>>>
>>>> When I run that code I get the following error
>>>>
>>>> --------------------------------------------------------------------------
>>>> mpiexec was unable to launch the specified application as it could not
>>>> find an executable:
>>>>
>>>> Executable: -l
>>>> Node: raspberrypi
>>>>
>>>> while attempting to start process rank 0.
>>>> --------------------------------------------------------------------------
>>>>
>>>> Sophie
>>>>
>>>> On 9 September 2014 02:07, Kenneth Raffenetti <raffenet at mcs.anl.gov
>>>> <mailto:raffenet at mcs.anl.gov>> wrote:
>>>>
>>>>     Let's try to verify that the library is properly located on all the
>>>>     nodes you are running. We can do a simple test with mpiexec. Try
>>>>     running:
>>>>
>>>>        mpiexec -n 4 -machinefile /home/pi/mpi_testing/machinefile -l \
>>>>            ls /home/rpimpi/mpich3-install/lib/libmpi.so.12
>>>>
>>>>     Ken
>>>>
>>>>     On 09/08/2014 07:03 PM, Sophie Norman wrote:
>>>>
>>>>         Hi Ken,
>>>>         Yes I used mpicc to build the binary
>>>>
>>>>         The output of 'mpicc -show' is:
>>>>
>>>>         gcc -I/home/rpimpi/mpich3-install/include
>>>>         -L/home/rpimpi/mpich3-install/lib -Wl,-rpath
>>>>         -Wl,/home/rpimpi/mpich3-install/lib -Wl,--enable-new-dtags
>>>>         -lmpi -lrt -lpthread
>>>>
>>>>
>>>>         Sophie
>>>>
>>>>         On 9 September 2014 00:48, Kenneth Raffenetti
>>>>         <raffenet at mcs.anl.gov <mailto:raffenet at mcs.anl.gov>
>>>>         <mailto:raffenet at mcs.anl.gov <mailto:raffenet at mcs.anl.gov>>>
>>>> wrote:
>>>>
>>>>              Did you use mpicc to build your binary? What is the output
>>>>              of 'mpicc -show'? The compile wrappers should encode an
>>>>              RPATH tag in your program so that setting LD_LIBRARY_PATH
>>>>              to find libmpi.so is unnecessary.
>>>>
>>>>              Ken
>>>>
>>>>
>>>>              On 09/08/2014 06:44 PM, Sophie Norman wrote:
>>>>
>>>>                  Hi,
>>>>
>>>>                  When running NAS parallel benchmarks using 4 Raspberry
>>>>         Pi nodes each
>>>>                  running MPICH-3.1.2 I get the following error when I
>>>>         try to run the
>>>>                  integer sort benchmark
>>>>
>>>>                  ~/NPB3.3/NPB3.3-MPI/bin $ mpiexec -n 4 -machinefile
>>>>                  /home/pi/mpi_testing/machinefile ./is.S.4
>>>>
>>>>
>>>>                     NAS Parallel Benchmarks 3.3 -- IS Benchmark
>>>>
>>>>                     Size:  65536  (class S)
>>>>                     Iterations:   10
>>>>                     Number of processes:     1
>>>>
>>>>                     ERROR: compiled for 4 processes
>>>>                     Number of active processes: 1
>>>>                     Exiting program!
>>>>
>>>>                  ./is.S.4: error while loading shared libraries:
>>>>         libmpi.so.12: cannot
>>>>                  open shared object file: No such file or directory
>>>>                  ./is.S.4: error while loading shared libraries:
>>>>         libmpi.so.12: cannot
>>>>                  open shared object file: No such file or directory
>>>>                  ./is.S.4: error while loading shared libraries:
>>>>         libmpi.so.12: cannot
>>>>                  open shared object file: No such file or directory
>>>>
>>>>
>>>>
>>>>                  I have tried
>>>>                  export
>>>>
>>>>         LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/rpimpi/mpich3-install/lib
>>>>                  on each of the nodes, and each node works individually
>>>>                  when run with the file ./is.S.1
>>>>
>>>>                  $ ldd ./is.S.4 gives the following output for the
>>>> master node
>>>>
>>>>                  /usr/lib/arm-linux-gnueabihf/libcofi_rpi.so (0xb6edf000)
>>>>                  libmpi.so.12 =>
>>>>         /home/rpimpi/mpich3-install/lib/libmpi.so.12 (0xb6ca3000)
>>>>                  libc.so.6 => /lib/arm-linux-gnueabihf/libc.so.6
>>>>         (0xb6b68000)
>>>>                  librt.so.1 => /lib/arm-linux-gnueabihf/librt.so.1
>>>>         (0xb6b59000)
>>>>                  libpthread.so.0 =>
>>>>         /lib/arm-linux-gnueabihf/libpthread.so.0 (0xb6b3a000)
>>>>                  libgfortran.so.3 =>
>>>>         /usr/lib/arm-linux-gnueabihf/libgfortran.so.3 (0xb6a93000)
>>>>                  libm.so.6 => /lib/arm-linux-gnueabihf/libm.so.6
>>>>         (0xb6a22000)
>>>>                  libgcc_s.so.1 => /lib/arm-linux-gnueabihf/libgcc_s.so.1
>>>>         (0xb69fa000)
>>>>                  /lib/ld-linux-armhf.so.3 (0xb6eed000)
>>>>
>>>>
>>>>                  Please help!
>>>>
>>>>                  Thanks
>>>>
>>>>                  Sophie
>>>>
>>>>
>>>>                  _______________________________________________
>>>>                  discuss mailing list     discuss at mpich.org
>>>>                  To manage subscription options or unsubscribe:
>>>>                  https://lists.mpich.org/mailman/listinfo/discuss
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>>
>>
>>
>>
>>
>
>
>
>
>

