<div dir="ltr"><div><div><br></div><div>Hi,<br><br></div><div>From time to time I can get 6 processes running, and sometimes it crashes when only 5 have started.<br></div><div>Whichever machine I add as the third one triggers the problem, so it seems to be a question of<br></div><div>communication between the two slaves: it also crashes when the two slaves<br></div><div>start working together:  mpirun -np 2 -hosts ugh,kaak ls<br></div><div><br></div><div>File permissions should be OK, the ports are open, and I am using rsh.<br></div><div>Could the communication demux engine be involved?<br></div><div>Could PAM control be a factor?<br></div><div>Still wondering about NFSv4?<br></div><div><br></div><div>joni<br><br></div><div>hosts<br>======<br></div><div>mpi1:2<br></div>
<div>ugh:2<br></div><div>kaak:2<br></div><div><br></div><div>Debug output after the crash<br>======================<br>mpiexec -np 5 hostname<br>....<br></div><div>[mpiexec@mpi1] Launch arguments: /mpi3/C3/mpich-3.0.4/bin/hydra_pmi_proxy --control-port <a href="http://192.168.0.41:7000">192.168.0.41:7000</a> --debug --rmk user --launcher rsh --launcher-exec /usr/bin/rsh-redone-rsh --demux poll --pgid 0 --retries 2 --usize -2 --proxy-id 0 <br>
[mpiexec@mpi1] Launch arguments: /usr/bin/rsh-redone-rsh 192.168.0.42 "/mpi3/C3/mpich-3.0.4/bin/hydra_pmi_proxy" --control-port <a href="http://192.168.0.41:7000">192.168.0.41:7000</a> --debug --rmk user --launcher rsh --launcher-exec /usr/bin/rsh-redone-rsh --demux poll --pgid 0 --retries 2 --usize -2 --proxy-id 1 <br>
[mpiexec@mpi1] Launch arguments: /usr/bin/rsh-redone-rsh 192.168.0.43 "/mpi3/C3/mpich-3.0.4/bin/hydra_pmi_proxy" --control-port <a href="http://192.168.0.41:7000">192.168.0.41:7000</a> --debug --rmk user --launcher rsh --launcher-exec /usr/bin/rsh-redone-rsh --demux poll --pgid 0 --retries 2 --usize -2 --proxy-id 2 <br>
mpi1<br>mpi1<br>^X^C[mpiexec@mpi1] Sending Ctrl-C to processes as requested<br>[mpiexec@mpi1] Press Ctrl-C again to force abort<br>[mpiexec@mpi1] HYDU_sock_write (./utils/sock/sock.c:291): write error (Bad file descriptor)<br>
[mpiexec@mpi1] HYD_pmcd_pmiserv_send_signal (./pm/pmiserv/pmiserv_cb.c:170): unable to write data to proxy<br>[mpiexec@mpi1] ui_cmd_cb (./pm/pmiserv/pmiserv_pmci.c:79): unable to send signal downstream<br>[mpiexec@mpi1] HYDT_dmxu_poll_wait_for_event (./tools/demux/demux_poll.c:77): callback returned error status<br>
[mpiexec@mpi1] HYD_pmci_wait_for_completion (./pm/pmiserv/pmiserv_pmci.c:197): error waiting for event<br>[mpiexec@mpi1] main (./ui/mpich/mpiexec.c:331): process manager error waiting for completion<br>joni@mpi1:~$ echo $HYDRA_DEMUX<br>
select<br>joni@mpi1:~$ sudo killall rsh-redone-rsh<br>[sudo] password for joni: <br>rsh-redone-rsh: ei prosesseja  (Finnish: "no processes")<br>joni@mpi1:~$ <br><br><br></div><div>BLCR test output after recompiling<br></div><div>====================== <br>
</div><div><div>root@mpi1:/mpi3/S3/blcr-0.8.5# master kaak errors:<br><br>34 child 21967 completed<br>#ST_ALARM:60<br>035 child 22072 is READY (context=SIGNAL stopped=YES)<br>036 child 22072 is STOPped<br>!!! Alarm clock expired<br>
!!! Missing final DONE<br>!!! Test killed unexpectedly by signal 9<br>FAIL: <a href="http://stopped.st">stopped.st</a><br><br><br>root@ug:/mpi3/S3/blcr-0.8.5# slave kaak errors:<br><br><br>root@kaak:/mpi3/S3/blcr-0.8.5# slave kaak errors:<br>
<br><br>make check<br><br>34 child 7472 completed<br>#ST_ALARM:60<br>035 child 7577 is READY (context=SIGNAL stopped=YES)<br>036 child 7577 is STOPped<br>!!! Alarm clock expired<br>!!! Missing final DONE<br>!!! Test killed unexpectedly by signal 9<br>
FAIL: <a href="http://stopped.st">stopped.st</a><br>PASS: <a href="http://edeadlk.st">edeadlk.st</a><br><br>/mpi3/S3/blcr-0.8.5/tests/.libs/lt-filedescriptors[8051]: file "filedescriptors.c", line 270, in check_stat_simple: File attributes changed.  1 mismatches<br>
/mpi3/S3/blcr-0.8.5/tests/.libs/lt-filedescriptors[8051]: file "crut.c", line 615, in crut_main: test_restart() unexpectedly returned -1<br>restart/nonzeroexit (255)<br>FAIL: filedescriptors.ct<br><br><br><br></div>
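Since the crash appears only once the two slaves talk to each other, a quick isolation step would be to test the launcher path outside of mpiexec first. The sketch below is only a suggestion: it reuses the launcher path and slave IPs from the debug output above, and then retries the two-slave case while forcing the select demux engine via Hydra's -demux command-line option rather than the HYDRA_DEMUX variable (the launch arguments above show --demux poll despite HYDRA_DEMUX=select).

```shell
# Diagnostic sketch (assumptions: the rsh-redone-rsh path and the slave
# IPs 192.168.0.42/43 are taken from the debug output above).
# Step 1: confirm the rsh launcher itself reaches each slave.
RSH=/usr/bin/rsh-redone-rsh
for host in 192.168.0.42 192.168.0.43; do
    if "$RSH" "$host" hostname 2>/dev/null; then
        echo "rsh to $host OK"
    else
        echo "rsh to $host FAILED"
    fi
done

# Step 2: retry the failing two-slave case, forcing the demux engine
# explicitly on the command line (-demux is a Hydra mpiexec option).
if command -v mpiexec >/dev/null 2>&1; then
    mpiexec -demux select -np 2 -hosts ugh,kaak hostname
else
    echo "mpiexec not on PATH; run this from the master node"
fi
```

If step 1 fails for either slave, the problem is in the rsh/PAM layer rather than in MPICH itself.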
</div></div></div><div class="gmail_extra"><br><br><div class="gmail_quote">2013/8/27 Joni-Pekka Kurronen <span dir="ltr"><<a href="mailto:joni.kurronen@gmail.com" target="_blank">joni.kurronen@gmail.com</a>></span><br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div><br></div><div><br></div>Another:<br><div><br>joni@mpi1:/mpi3/S3/mpich-3.0.4$ mpiexec -np 6 hostname<br>
host: 192.168.0.41<br>host: 192.168.0.42<br>host: 192.168.0.43<br><br>==================================================================================================<br>
mpiexec options:<br>----------------<br>  Base path: /mpi3/C3/mpich-3.0.4/bin/<br>  Launcher: (null)<br>  Debug level: 1<br>  Enable X: -1<br><br>  Global environment:<br>  -------------------<div><div class="h5"><br>    MUMPS=/mpi3/S3/MUMPS_4.10.0<br>

    LC_PAPER=fi_FI.UTF-8<br>    LC_ADDRESS=fi_FI.UTF-8<br>    SSH_AGENT_PID=12144<br>    LC_MONETARY=fi_FI.UTF-8<br>    MUMPS_I=/mpi3/C3/MUMPS_4.10.0<br>    HYDRA_DEMUX=select<br>    GPG_AGENT_INFO=/tmp/keyring-kJwpJQ/gpg:0:1<br>

    JPK_LMETISDIR_S5=/mpi3/S3/parmetis-4.0.2<br>    TERM=xterm<br>    SHELL=/bin/bash<br>    XDG_SESSION_COOKIE=6d6390cb56a32b6678c10da600000412-1377606907.629665-1922379047<br>    FFT=/mpi3/C3/fftw2<br>    HYDRA_ENV=all<br>

    JPK_NETGEN=/mpi3/C3/netgen_668<br>    JPK_VER_S=S3<br>    HYDRA_CKPOINTLIB=blcr<br>    HYDRA_CKPOINT_INTERVAL=10800<br>    WINDOWID=54602522<br>    LC_NUMERIC=fi_FI.UTF-8<br>    HYDRA_CKPOINT_PREFIX=/mpi3/chekpoint/default.chk<br>

    GNOME_KEYRING_CONTROL=/tmp/keyring-kJwpJQ<br>    JPK_ELMER=/mpi3/C3/elmer_6283<br>    PARDISO_LIC_PATH=/mpi3/C3/pardiso<br>    METIS_INCLUDE_DIR=/mpi3/C3/ParMetis-3.2.0<br>    JPK_NETGEN_S=/mpi3/S3/netgen_668<br>    USER=joni<br>

    LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arj=01;31:*.taz=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lz=01;31:*.xz=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=01;35:*.anx=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.axa=00;36:*.oga=00;36:*.spx=00;36:*.xspf=00;36:<br>

    JPK_TOGL=/mpi3/C3/Togl-1.7<br>    LD_LIBRARY_PATH=/mpi3/C3/mpich-3.0.4/lib:/mpi3/C3/mpich-3.0.4/bin:/mpi3/C3/blcr-0.8.5/lib:/mpi3/C3/ParMetis-3.2.0:/mpi3/C3/acml5.3.1/gfortran64_mp/lib:/lib:/mpi3/C3/hypre-2.8.0b/lib:/mpi3/C3/scotch_6.0.0/lib:/mpi3/S3/MUMPS_4.10.0/lib:/mpi3/C3/acml5.3.1/gfortran64_mp/lib:/mpi3/C3/scalapack-2.0.2/lib:/mpi3/C3/hypre-2.8.0b/lib:/mpi3/C3/pardiso:/mpi3/C3/ParMetis-3.2.0:/mpi3/C3/ARPACK:/mpi3/C3/hdf5-1.8.10-patch1/lib:/mpi3/C3/VTK-5.8.0/lib/vtk-5.8:/mpi3/C3/elmer_6283/lib:/mpi3/C3/Togl-1.7:/mpi3/C3/netgen_668/lib:/usr/lib/:/usr/local/lib://mpi3/C3/vrpn/lib://mpi3/C3/hidapi/lib:/usr/include/libusb-1.0<br>

    LC_TELEPHONE=fi_FI.UTF-8<br>    XDG_SESSION_PATH=/org/freedesktop/DisplayManager/Session0<br>    JPK_OCC=/usr/include/oce<br>    XDG_SEAT_PATH=/org/freedesktop/DisplayManager/Seat0<br>    HYDRA_HOST_FILE=/mpi4/hosts<br>

    SSH_AUTH_SOCK=/tmp/ssh-NnhxNTH12143/agent.12143<br>    SCOTCHDIR=/mpi3/C3/scotch_6.0.0<br>    HYDRA_LAUNCHER=rsh<br>    JPK_VER_B=B3<br>    SESSION_MANAGER=local/mpi1:@/tmp/.ICE-unix/4284,unix/mpi1:/tmp/.ICE-unix/4284<br>

    DEFAULTS_PATH=/usr/share/gconf/ubuntu.default.path<br>    ELMER_HOME=/mpi3/C3/elmer_6283<br>    BLACS=/mpi3/C3/scalapack-2.0.2<br>    BLAS32=/mpi3/C3/acml5.3.1/gfortran64_mp<br>    METIS_DIR=<br></div></div><div><div class="h5">
    MPI_LIBS=-L/mpi3/C3/mpich-3.0.4/lib -lmpich -lmpichf90 -lmpl -lopa -lmpichcxx<br>
    XDG_CONFIG_DIRS=/etc/xdg/xdg-ubuntu:/etc/xdg<br>    JPK_MPI_DIR=/mpi3<br>    JPK_HDF5_S=/mpi3/S3/hdf5-1.8.10-patch1<br>    MPIEXEC_PORT_RANGE=7000:7500<br>    PATH=/mpi3/C3/cmake-2.8.10.2/bin:/mpi3/C3/blcr-0.8.5/bin:/mpi3/C3/mpich-3.0.4/bin:/usr/lib/lightdm/lightdm:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/mpi3/C3/elmer_6283/bin:/mpi3/C3/elmer_6283/lib:/mpi3/C3/ParaView3<br>

    DESKTOP_SESSION=ubuntu<br>    BLAS=/mpi3/C3/acml5.3.1/gfortran64_mp<br>    METIS_LIBDIR=/mpi3/C3/ParMetis-3.2.0<br>    CMAKE_COMMAND=/mpi3/C3/cmake-2.8.10.2/bin<br>    QT_QMAKE_EXECUTABLE=/usr/bin/qmake-qt4<br>    LC_IDENTIFICATION=fi_FI.UTF-8<br>

    JPK_SCOTCHDIR_S=/mpi3/S3/scotch_6.0.0_esmumps<br>    JPK_LMETISDIR_S=/mpi3/S3/ParMetis-3.2.0<br>    PWD=/mpi3/S3/mpich-3.0.4<br>    NETGENDIR=/mpi3/C3/netgen_668/bin<br>    EDITOR=nano<br>    JPK_LMETISDIR=/mpi3/C3/ParMetis-3.2.0<br>

    GNOME_KEYRING_PID=4273<br>    LANG=fi_FI.UTF-8<br>    MANDATORY_PATH=/usr/share/gconf/ubuntu.mandatory.path<br>    OMPI_77=//mpi3/C3/mpich-3.0.4/bin/mpif77<br>    LC_MEASUREMENT=fi_FI.UTF-8<br>    JPK_HDF5=/mpi3/C3/hdf5-1.8.10-patch1<br>

    UBUNTU_MENUPROXY=libappmenu.so<br>    COMPIZ_CONFIG_PROFILE=ubuntu<br>    ELMER_POST_HOME=/mpi3/C3/elmer_6283/bin<br>    JPK_INS=/mpi3/C3<br>    ELMER_LIB=/mpi3/C3/elmer_6283/share/elmersolver/lib<br>    HYDRA_PROXY_RETRY_COUNT=3<br>

    GDMSESSION=ubuntu<br>    JPK_ELMER_S=/mpi3/S3/elmer_6283<br>    JPK_LMETISDIR5=/mpi3/C3/parmetis-4.0.2<br>    JPK_LMETISDIR32=/mpi3/C3/ParMetis-3.2.0<br>    HYDRA_DEBUG=1<br>    JPK_BUI=/mpi3/S3<br>    VTK_INCLUDEPATH=/mpi3/C3/VTK-5.8.0/include<br>

    SHLVL=1<br>    HOME=/home/joni<br>    OMPI_CC=//mpi3/C3/mpich-3.0.4/bin/mpicc<br>    LANGUAGE=fi:en<br>    OMPI_90=//mpi3/C3/mpich-3.0.4/bin/mpif90<br>    ELMERGUI_HOME=/mpi3/C3/elmer_6283/bin<br>    GNOME_DESKTOP_SESSION_ID=this-is-deprecated<br>

    MPI_IMPLEMENTATION=mpich<br>    MKL_SERIAL=YES<br>    LOGNAME=joni<br>    HYPRE=/mpi3/C3/hypre-2.8.0b<br>    JPK_ARPACK_S=/mpi3/S3/ARPACK<br>    JPK_JOBS=7<br>    JPK_VTK_DIR=/mpi3/C3/VTK-5.8.0<br>    SCALAPACK=/mpi3/C3/scalapack-2.0.2<br>

    XDG_DATA_DIRS=/usr/share/ubuntu:/usr/share/gnome:/usr/local/share/:/usr/share/<br>    DBUS_SESSION_BUS_ADDRESS=unix:abstract=/tmp/dbus-gAYmjGaklf,guid=26c3f15a7a5ee55e8782415700000034<br>    JPK_ARPACK=/mpi3/C3/ARPACK<br>

    MPI_HOME=/mpi3/C3/mpich-3.0.4<br>    LESSOPEN=| /usr/bin/lesspipe %s<br>    LACPACK=/mpi3/C3/acml5.3.1/gfortran64_mp<br>    OMPI_CXX=//mpi3/C3/mpich-3.0.4/bin/mpicxx<br>    OMP_NUM_THREADS=6<br>    JPK_TOGL_S=/mpi3/S3/Togl-1.7<br>

    HYDRA_LAUNCHER_EXEC=/usr/bin/rsh-redone-rsh<br>    JPK_MPICH2=/mpi3/C3/mpich-3.0.4<br>    PARDISO=/mpi3/C3/pardiso<br>    PARDISOLICMESSAGE=1<br>    JPK_VER=C3<br>    XDG_CURRENT_DESKTOP=Unity<br>    LESSCLOSE=/usr/bin/lesspipe %s %s<br>

    LC_TIME=fi_FI.UTF-8<br>    JPK_HYPRE_S=/mpi3/S3/hypre-2.8.0b<br>    JPK_MPICH2_S=/mpi3/S3/mpich-3.0.4<br>    COLORTERM=gnome-terminal<br>    XAUTHORITY=/home/joni/.Xauthority<br>    LC_NAME=fi_FI.UTF-8<br>    _=/mpi3/C3/mpich-3.0.4/bin/mpiexec<br>

    OLDPWD=/home/joni<br><br>  Hydra internal environment:<br>  ---------------------------<br>    MPICH_ENABLE_CKPOINT=1<br>    GFORTRAN_UNBUFFERED_PRECONNECTED=y<br><br><br>    Proxy information:<br>    *********************<br>

      [1] proxy: 192.168.0.41 (2 cores)<br></div></div>      Exec list: hostname (2 processes); <br><div class="im"><br>      [2] proxy: 192.168.0.42 (2 cores)<br></div>      Exec list: hostname (2 processes); <br><br>      [3] proxy: 192.168.0.43 (2 cores)<br>

      Exec list: hostname (2 processes); <br><div class="im"><br><br>==================================================================================================<br><br>[mpiexec@mpi1] Timeout set to -1 (-1 means infinite)<br>
[mpiexec@mpi1] Got a control port string of <a href="http://192.168.0.41:7001" target="_blank">192.168.0.41:7001</a><br>
<br>Proxy launch args: /mpi3/C3/mpich-3.0.4/bin/hydra_pmi_proxy --control-port <a href="http://192.168.0.41:7001" target="_blank">192.168.0.41:7001</a> --debug --rmk user --launcher rsh --launcher-exec /usr/bin/rsh-redone-rsh --demux poll --pgid 0 --retries 3 --usize -2 --proxy-id <br>

<br>Arguments being passed to proxy 0:<br></div>--version 3.0.4 --iface-ip-env-name MPICH_INTERFACE_HOSTNAME --hostname 192.168.0.41 --global-core-map 0,2,6 --pmi-id-map 0,0 --global-process-count 6 --auto-cleanup 1 --pmi-kvsname kvs_12243_0 --pmi-process-mapping (vector,(0,3,2)) --ckpoint-prefix /mpi3/chekpoint/default.chk --ckpoint-num -1 --global-inherited-env 121 'MUMPS=/mpi3/S3/MUMPS_4.10.0' 'LC_PAPER=fi_FI.UTF-8' 'LC_ADDRESS=fi_FI.UTF-8' 'SSH_AGENT_PID=12144' 'LC_MONETARY=fi_FI.UTF-8' 'MUMPS_I=/mpi3/C3/MUMPS_4.10.0' 'HYDRA_DEMUX=select' 'GPG_AGENT_INFO=/tmp/keyring-kJwpJQ/gpg:0:1' 'JPK_LMETISDIR_S5=/mpi3/S3/parmetis-4.0.2' 'TERM=xterm' 'SHELL=/bin/bash' 'XDG_SESSION_COOKIE=6d6390cb56a32b6678c10da600000412-1377606907.629665-1922379047' 'FFT=/mpi3/C3/fftw2' 'HYDRA_ENV=all' 'JPK_NETGEN=/mpi3/C3/netgen_668' 'JPK_VER_S=S3' 'HYDRA_CKPOINTLIB=blcr' 'HYDRA_CKPOINT_INTERVAL=10800' 'WINDOWID=54602522' 'LC_NUMERIC=fi_FI.UTF-8' 'HYDRA_CKPOINT_PREFIX=/mpi3/chekpoint/default.chk' 'GNOME_KEYRING_CONTROL=/tmp/keyring-kJwpJQ' 'JPK_ELMER=/mpi3/C3/elmer_6283' 'PARDISO_LIC_PATH=/mpi3/C3/pardiso' 'METIS_INCLUDE_DIR=/mpi3/C3/ParMetis-3.2.0' 'JPK_NETGEN_S=/mpi3/S3/netgen_668' 'USER=joni' 
'LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arj=01;31:*.taz=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lz=01;31:*.xz=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=01;35:*.anx=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.axa=00;36:*.oga=00;36:*.spx=00;36:*.xspf=00;36:' 'JPK_TOGL=/mpi3/C3/Togl-1.7' 'LD_LIBRARY_PATH=/mpi3/C3/mpich-3.0.4/lib:/mpi3/C3/mpich-3.0.4/bin:/mpi3/C3/blcr-0.8.5/lib:/mpi3/C3/ParMetis-3.2.0:/mpi3/C3/acml5.3.1/gfortran64_mp/lib:/lib:/mpi3/C3/hypre-2.8.0b/lib:/mpi3/C3/scotch_6.0.0/lib:/mpi3/S3/MUMPS_4.10.0/lib:/mpi3/C3/acml5.3.1/gfortran64_mp/lib:/mpi3/C3/scalapack-2.0.2/lib:/mpi3/C3/hypre-2.8.0b/lib:/mpi3/C3/pardiso:/mpi3/C3/ParMetis-3.2.0:/mpi3/C3/ARPACK:/mpi3/C3/hdf5-1.8.10-patch1/lib:/mpi3/C3/VTK-5.8.0/lib/vtk-5.8:/mpi3/C3/elmer_6283/lib:/mpi3/C3/Togl-1.7:/mpi3/C3/netgen_668/lib:/usr/lib/:/usr/local/lib://mpi3/C3/vrpn/lib://mpi3/C3/hidapi/lib:/usr/include/libusb-1.0' 'LC_TELEPHONE=fi_FI.UTF-8' 
'XDG_SESSION_PATH=/org/freedesktop/DisplayManager/Session0' 'JPK_OCC=/usr/include/oce' 'XDG_SEAT_PATH=/org/freedesktop/DisplayManager/Seat0' 'HYDRA_HOST_FILE=/mpi4/hosts' 'SSH_AUTH_SOCK=/tmp/ssh-NnhxNTH12143/agent.12143' 'SCOTCHDIR=/mpi3/C3/scotch_6.0.0' 'HYDRA_LAUNCHER=rsh' 'JPK_VER_B=B3' 'SESSION_MANAGER=local/mpi1:@/tmp/.ICE-unix/4284,unix/mpi1:/tmp/.ICE-unix/4284' 'DEFAULTS_PATH=/usr/share/gconf/ubuntu.default.path' 'ELMER_HOME=/mpi3/C3/elmer_6283' 'BLACS=/mpi3/C3/scalapack-2.0.2' 'BLAS32=/mpi3/C3/acml5.3.1/gfortran64_mp' 'METIS_DIR=' 'MPI_LIBS=-L/mpi3/C3/mpich-3.0.4/lib -lmpich -lmpichf90 -lmpl -lopa -lmpichcxx' 'XDG_CONFIG_DIRS=/etc/xdg/xdg-ubuntu:/etc/xdg' 'JPK_MPI_DIR=/mpi3' 'JPK_HDF5_S=/mpi3/S3/hdf5-1.8.10-patch1' 'MPIEXEC_PORT_RANGE=7000:7500' 'PATH=/mpi3/C3/cmake-2.8.10.2/bin:/mpi3/C3/blcr-0.8.5/bin:/mpi3/C3/mpich-3.0.4/bin:/usr/lib/lightdm/lightdm:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/mpi3/C3/elmer_6283/bin:/mpi3/C3/elmer_6283/lib:/mpi3/C3/ParaView3' 'DESKTOP_SESSION=ubuntu' 'BLAS=/mpi3/C3/acml5.3.1/gfortran64_mp' 'METIS_LIBDIR=/mpi3/C3/ParMetis-3.2.0' 'CMAKE_COMMAND=/mpi3/C3/cmake-2.8.10.2/bin' 'QT_QMAKE_EXECUTABLE=/usr/bin/qmake-qt4' 'LC_IDENTIFICATION=fi_FI.UTF-8' 'JPK_SCOTCHDIR_S=/mpi3/S3/scotch_6.0.0_esmumps' 'JPK_LMETISDIR_S=/mpi3/S3/ParMetis-3.2.0' 'PWD=/mpi3/S3/mpich-3.0.4' 'NETGENDIR=/mpi3/C3/netgen_668/bin' 'EDITOR=nano' 'JPK_LMETISDIR=/mpi3/C3/ParMetis-3.2.0' 'GNOME_KEYRING_PID=4273' 'LANG=fi_FI.UTF-8' 'MANDATORY_PATH=/usr/share/gconf/ubuntu.mandatory.path' 'OMPI_77=//mpi3/C3/mpich-3.0.4/bin/mpif77' 'LC_MEASUREMENT=fi_FI.UTF-8' 'JPK_HDF5=/mpi3/C3/hdf5-1.8.10-patch1' 'UBUNTU_MENUPROXY=libappmenu.so' 'COMPIZ_CONFIG_PROFILE=ubuntu' 'ELMER_POST_HOME=/mpi3/C3/elmer_6283/bin' 'JPK_INS=/mpi3/C3' 'ELMER_LIB=/mpi3/C3/elmer_6283/share/elmersolver/lib' 'HYDRA_PROXY_RETRY_COUNT=3' 'GDMSESSION=ubuntu' 'JPK_ELMER_S=/mpi3/S3/elmer_6283' 'JPK_LMETISDIR5=/mpi3/C3/parmetis-4.0.2' 'JPK_LMETISDIR32=/mpi3/C3/ParMetis-3.2.0' 
'HYDRA_DEBUG=1' 'JPK_BUI=/mpi3/S3' 'VTK_INCLUDEPATH=/mpi3/C3/VTK-5.8.0/include' 'SHLVL=1' 'HOME=/home/joni' 'OMPI_CC=//mpi3/C3/mpich-3.0.4/bin/mpicc' 'LANGUAGE=fi:en' 'OMPI_90=//mpi3/C3/mpich-3.0.4/bin/mpif90' 'ELMERGUI_HOME=/mpi3/C3/elmer_6283/bin' 'GNOME_DESKTOP_SESSION_ID=this-is-deprecated' 'MPI_IMPLEMENTATION=mpich' 'MKL_SERIAL=YES' 'LOGNAME=joni' 'HYPRE=/mpi3/C3/hypre-2.8.0b' 'JPK_ARPACK_S=/mpi3/S3/ARPACK' 'JPK_JOBS=7' 'JPK_VTK_DIR=/mpi3/C3/VTK-5.8.0' 'SCALAPACK=/mpi3/C3/scalapack-2.0.2' 'XDG_DATA_DIRS=/usr/share/ubuntu:/usr/share/gnome:/usr/local/share/:/usr/share/' 'DBUS_SESSION_BUS_ADDRESS=unix:abstract=/tmp/dbus-gAYmjGaklf,guid=26c3f15a7a5ee55e8782415700000034' 'JPK_ARPACK=/mpi3/C3/ARPACK' 'MPI_HOME=/mpi3/C3/mpich-3.0.4' 'LESSOPEN=| /usr/bin/lesspipe %s' 'LACPACK=/mpi3/C3/acml5.3.1/gfortran64_mp' 'OMPI_CXX=//mpi3/C3/mpich-3.0.4/bin/mpicxx' 'OMP_NUM_THREADS=6' 'JPK_TOGL_S=/mpi3/S3/Togl-1.7' 'HYDRA_LAUNCHER_EXEC=/usr/bin/rsh-redone-rsh' 'JPK_MPICH2=/mpi3/C3/mpich-3.0.4' 'PARDISO=/mpi3/C3/pardiso' 'PARDISOLICMESSAGE=1' 'JPK_VER=C3' 'XDG_CURRENT_DESKTOP=Unity' 'LESSCLOSE=/usr/bin/lesspipe %s %s' 'LC_TIME=fi_FI.UTF-8' 'JPK_HYPRE_S=/mpi3/S3/hypre-2.8.0b' 'JPK_MPICH2_S=/mpi3/S3/mpich-3.0.4' 'COLORTERM=gnome-terminal' 'XAUTHORITY=/home/joni/.Xauthority' 'LC_NAME=fi_FI.UTF-8' '_=/mpi3/C3/mpich-3.0.4/bin/mpiexec' 'OLDPWD=/home/joni' --global-user-env 0 --global-system-env 2 'MPICH_ENABLE_CKPOINT=1' 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --genv-prop all --proxy-core-count 2 --exec --exec-appnum 0 --exec-proc-count 2 --exec-local-env 0 --exec-wdir /mpi3/S3/mpich-3.0.4 --exec-args 1 hostname <br>
<div class="im">
<br>Arguments being passed to proxy 1:<br></div>--version 3.0.4 --iface-ip-env-name MPICH_INTERFACE_HOSTNAME --hostname 192.168.0.42 --global-core-map 0,2,6 --pmi-id-map 0,2 --global-process-count 6 --auto-cleanup 1 --pmi-kvsname kvs_12243_0 --pmi-process-mapping (vector,(0,3,2)) --ckpoint-prefix /mpi3/chekpoint/default.chk --ckpoint-num -1 --global-inherited-env 121 'MUMPS=/mpi3/S3/MUMPS_4.10.0' 'LC_PAPER=fi_FI.UTF-8' 'LC_ADDRESS=fi_FI.UTF-8' 'SSH_AGENT_PID=12144' 'LC_MONETARY=fi_FI.UTF-8' 'MUMPS_I=/mpi3/C3/MUMPS_4.10.0' 'HYDRA_DEMUX=select' 'GPG_AGENT_INFO=/tmp/keyring-kJwpJQ/gpg:0:1' 'JPK_LMETISDIR_S5=/mpi3/S3/parmetis-4.0.2' 'TERM=xterm' 'SHELL=/bin/bash' 'XDG_SESSION_COOKIE=6d6390cb56a32b6678c10da600000412-1377606907.629665-1922379047' 'FFT=/mpi3/C3/fftw2' 'HYDRA_ENV=all' 'JPK_NETGEN=/mpi3/C3/netgen_668' 'JPK_VER_S=S3' 'HYDRA_CKPOINTLIB=blcr' 'HYDRA_CKPOINT_INTERVAL=10800' 'WINDOWID=54602522' 'LC_NUMERIC=fi_FI.UTF-8' 'HYDRA_CKPOINT_PREFIX=/mpi3/chekpoint/default.chk' 'GNOME_KEYRING_CONTROL=/tmp/keyring-kJwpJQ' 'JPK_ELMER=/mpi3/C3/elmer_6283' 'PARDISO_LIC_PATH=/mpi3/C3/pardiso' 'METIS_INCLUDE_DIR=/mpi3/C3/ParMetis-3.2.0' 'JPK_NETGEN_S=/mpi3/S3/netgen_668' 'USER=joni' 
'LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arj=01;31:*.taz=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lz=01;31:*.xz=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=01;35:*.anx=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.axa=00;36:*.oga=00;36:*.spx=00;36:*.xspf=00;36:' 'JPK_TOGL=/mpi3/C3/Togl-1.7' 'LD_LIBRARY_PATH=/mpi3/C3/mpich-3.0.4/lib:/mpi3/C3/mpich-3.0.4/bin:/mpi3/C3/blcr-0.8.5/lib:/mpi3/C3/ParMetis-3.2.0:/mpi3/C3/acml5.3.1/gfortran64_mp/lib:/lib:/mpi3/C3/hypre-2.8.0b/lib:/mpi3/C3/scotch_6.0.0/lib:/mpi3/S3/MUMPS_4.10.0/lib:/mpi3/C3/acml5.3.1/gfortran64_mp/lib:/mpi3/C3/scalapack-2.0.2/lib:/mpi3/C3/hypre-2.8.0b/lib:/mpi3/C3/pardiso:/mpi3/C3/ParMetis-3.2.0:/mpi3/C3/ARPACK:/mpi3/C3/hdf5-1.8.10-patch1/lib:/mpi3/C3/VTK-5.8.0/lib/vtk-5.8:/mpi3/C3/elmer_6283/lib:/mpi3/C3/Togl-1.7:/mpi3/C3/netgen_668/lib:/usr/lib/:/usr/local/lib://mpi3/C3/vrpn/lib://mpi3/C3/hidapi/lib:/usr/include/libusb-1.0' 'LC_TELEPHONE=fi_FI.UTF-8' 
'XDG_SESSION_PATH=/org/freedesktop/DisplayManager/Session0' 'JPK_OCC=/usr/include/oce' 'XDG_SEAT_PATH=/org/freedesktop/DisplayManager/Seat0' 'HYDRA_HOST_FILE=/mpi4/hosts' 'SSH_AUTH_SOCK=/tmp/ssh-NnhxNTH12143/agent.12143' 'SCOTCHDIR=/mpi3/C3/scotch_6.0.0' 'HYDRA_LAUNCHER=rsh' 'JPK_VER_B=B3' 'SESSION_MANAGER=local/mpi1:@/tmp/.ICE-unix/4284,unix/mpi1:/tmp/.ICE-unix/4284' 'DEFAULTS_PATH=/usr/share/gconf/ubuntu.default.path' 'ELMER_HOME=/mpi3/C3/elmer_6283' 'BLACS=/mpi3/C3/scalapack-2.0.2' 'BLAS32=/mpi3/C3/acml5.3.1/gfortran64_mp' 'METIS_DIR=' 'MPI_LIBS=-L/mpi3/C3/mpich-3.0.4/lib -lmpich -lmpichf90 -lmpl -lopa -lmpichcxx' 'XDG_CONFIG_DIRS=/etc/xdg/xdg-ubuntu:/etc/xdg' 'JPK_MPI_DIR=/mpi3' 'JPK_HDF5_S=/mpi3/S3/hdf5-1.8.10-patch1' 'MPIEXEC_PORT_RANGE=7000:7500' 'PATH=/mpi3/C3/cmake-2.8.10.2/bin:/mpi3/C3/blcr-0.8.5/bin:/mpi3/C3/mpich-3.0.4/bin:/usr/lib/lightdm/lightdm:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/mpi3/C3/elmer_6283/bin:/mpi3/C3/elmer_6283/lib:/mpi3/C3/ParaView3' 'DESKTOP_SESSION=ubuntu' 'BLAS=/mpi3/C3/acml5.3.1/gfortran64_mp' 'METIS_LIBDIR=/mpi3/C3/ParMetis-3.2.0' 'CMAKE_COMMAND=/mpi3/C3/cmake-2.8.10.2/bin' 'QT_QMAKE_EXECUTABLE=/usr/bin/qmake-qt4' 'LC_IDENTIFICATION=fi_FI.UTF-8' 'JPK_SCOTCHDIR_S=/mpi3/S3/scotch_6.0.0_esmumps' 'JPK_LMETISDIR_S=/mpi3/S3/ParMetis-3.2.0' 'PWD=/mpi3/S3/mpich-3.0.4' 'NETGENDIR=/mpi3/C3/netgen_668/bin' 'EDITOR=nano' 'JPK_LMETISDIR=/mpi3/C3/ParMetis-3.2.0' 'GNOME_KEYRING_PID=4273' 'LANG=fi_FI.UTF-8' 'MANDATORY_PATH=/usr/share/gconf/ubuntu.mandatory.path' 'OMPI_77=//mpi3/C3/mpich-3.0.4/bin/mpif77' 'LC_MEASUREMENT=fi_FI.UTF-8' 'JPK_HDF5=/mpi3/C3/hdf5-1.8.10-patch1' 'UBUNTU_MENUPROXY=libappmenu.so' 'COMPIZ_CONFIG_PROFILE=ubuntu' 'ELMER_POST_HOME=/mpi3/C3/elmer_6283/bin' 'JPK_INS=/mpi3/C3' 'ELMER_LIB=/mpi3/C3/elmer_6283/share/elmersolver/lib' 'HYDRA_PROXY_RETRY_COUNT=3' 'GDMSESSION=ubuntu' 'JPK_ELMER_S=/mpi3/S3/elmer_6283' 'JPK_LMETISDIR5=/mpi3/C3/parmetis-4.0.2' 'JPK_LMETISDIR32=/mpi3/C3/ParMetis-3.2.0' 
'HYDRA_DEBUG=1' 'JPK_BUI=/mpi3/S3' 'VTK_INCLUDEPATH=/mpi3/C3/VTK-5.8.0/include' 'SHLVL=1' 'HOME=/home/joni' 'OMPI_CC=//mpi3/C3/mpich-3.0.4/bin/mpicc' 'LANGUAGE=fi:en' 'OMPI_90=//mpi3/C3/mpich-3.0.4/bin/mpif90' 'ELMERGUI_HOME=/mpi3/C3/elmer_6283/bin' 'GNOME_DESKTOP_SESSION_ID=this-is-deprecated' 'MPI_IMPLEMENTATION=mpich' 'MKL_SERIAL=YES' 'LOGNAME=joni' 'HYPRE=/mpi3/C3/hypre-2.8.0b' 'JPK_ARPACK_S=/mpi3/S3/ARPACK' 'JPK_JOBS=7' 'JPK_VTK_DIR=/mpi3/C3/VTK-5.8.0' 'SCALAPACK=/mpi3/C3/scalapack-2.0.2' 'XDG_DATA_DIRS=/usr/share/ubuntu:/usr/share/gnome:/usr/local/share/:/usr/share/' 'DBUS_SESSION_BUS_ADDRESS=unix:abstract=/tmp/dbus-gAYmjGaklf,guid=26c3f15a7a5ee55e8782415700000034' 'JPK_ARPACK=/mpi3/C3/ARPACK' 'MPI_HOME=/mpi3/C3/mpich-3.0.4' 'LESSOPEN=| /usr/bin/lesspipe %s' 'LACPACK=/mpi3/C3/acml5.3.1/gfortran64_mp' 'OMPI_CXX=//mpi3/C3/mpich-3.0.4/bin/mpicxx' 'OMP_NUM_THREADS=6' 'JPK_TOGL_S=/mpi3/S3/Togl-1.7' 'HYDRA_LAUNCHER_EXEC=/usr/bin/rsh-redone-rsh' 'JPK_MPICH2=/mpi3/C3/mpich-3.0.4' 'PARDISO=/mpi3/C3/pardiso' 'PARDISOLICMESSAGE=1' 'JPK_VER=C3' 'XDG_CURRENT_DESKTOP=Unity' 'LESSCLOSE=/usr/bin/lesspipe %s %s' 'LC_TIME=fi_FI.UTF-8' 'JPK_HYPRE_S=/mpi3/S3/hypre-2.8.0b' 'JPK_MPICH2_S=/mpi3/S3/mpich-3.0.4' 'COLORTERM=gnome-terminal' 'XAUTHORITY=/home/joni/.Xauthority' 'LC_NAME=fi_FI.UTF-8' '_=/mpi3/C3/mpich-3.0.4/bin/mpiexec' 'OLDPWD=/home/joni' --global-user-env 0 --global-system-env 2 'MPICH_ENABLE_CKPOINT=1' 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --genv-prop all --proxy-core-count 2 --exec --exec-appnum 0 --exec-proc-count 2 --exec-local-env 0 --exec-wdir /mpi3/S3/mpich-3.0.4 --exec-args 1 hostname <br>
<div class="im">
<br>Arguments being passed to proxy 2:<br></div>--version 3.0.4 --iface-ip-env-name MPICH_INTERFACE_HOSTNAME --hostname 192.168.0.43 --global-core-map 0,2,6 --pmi-id-map 0,4 --global-process-count 6 --auto-cleanup 1 --pmi-kvsname kvs_12243_0 --pmi-process-mapping (vector,(0,3,2)) --ckpoint-prefix /mpi3/chekpoint/default.chk --ckpoint-num -1 --global-inherited-env 121 'MUMPS=/mpi3/S3/MUMPS_4.10.0' 'LC_PAPER=fi_FI.UTF-8' 'LC_ADDRESS=fi_FI.UTF-8' 'SSH_AGENT_PID=12144' 'LC_MONETARY=fi_FI.UTF-8' 'MUMPS_I=/mpi3/C3/MUMPS_4.10.0' 'HYDRA_DEMUX=select' 'GPG_AGENT_INFO=/tmp/keyring-kJwpJQ/gpg:0:1' 'JPK_LMETISDIR_S5=/mpi3/S3/parmetis-4.0.2' 'TERM=xterm' 'SHELL=/bin/bash' 'XDG_SESSION_COOKIE=6d6390cb56a32b6678c10da600000412-1377606907.629665-1922379047' 'FFT=/mpi3/C3/fftw2' 'HYDRA_ENV=all' 'JPK_NETGEN=/mpi3/C3/netgen_668' 'JPK_VER_S=S3' 'HYDRA_CKPOINTLIB=blcr' 'HYDRA_CKPOINT_INTERVAL=10800' 'WINDOWID=54602522' 'LC_NUMERIC=fi_FI.UTF-8' 'HYDRA_CKPOINT_PREFIX=/mpi3/chekpoint/default.chk' 'GNOME_KEYRING_CONTROL=/tmp/keyring-kJwpJQ' 'JPK_ELMER=/mpi3/C3/elmer_6283' 'PARDISO_LIC_PATH=/mpi3/C3/pardiso' 'METIS_INCLUDE_DIR=/mpi3/C3/ParMetis-3.2.0' 'JPK_NETGEN_S=/mpi3/S3/netgen_668' 'USER=joni' 
'LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arj=01;31:*.taz=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lz=01;31:*.xz=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=01;35:*.anx=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.axa=00;36:*.oga=00;36:*.spx=00;36:*.xspf=00;36:' 'JPK_TOGL=/mpi3/C3/Togl-1.7' 'LD_LIBRARY_PATH=/mpi3/C3/mpich-3.0.4/lib:/mpi3/C3/mpich-3.0.4/bin:/mpi3/C3/blcr-0.8.5/lib:/mpi3/C3/ParMetis-3.2.0:/mpi3/C3/acml5.3.1/gfortran64_mp/lib:/lib:/mpi3/C3/hypre-2.8.0b/lib:/mpi3/C3/scotch_6.0.0/lib:/mpi3/S3/MUMPS_4.10.0/lib:/mpi3/C3/acml5.3.1/gfortran64_mp/lib:/mpi3/C3/scalapack-2.0.2/lib:/mpi3/C3/hypre-2.8.0b/lib:/mpi3/C3/pardiso:/mpi3/C3/ParMetis-3.2.0:/mpi3/C3/ARPACK:/mpi3/C3/hdf5-1.8.10-patch1/lib:/mpi3/C3/VTK-5.8.0/lib/vtk-5.8:/mpi3/C3/elmer_6283/lib:/mpi3/C3/Togl-1.7:/mpi3/C3/netgen_668/lib:/usr/lib/:/usr/local/lib://mpi3/C3/vrpn/lib://mpi3/C3/hidapi/lib:/usr/include/libusb-1.0' 'LC_TELEPHONE=fi_FI.UTF-8' 
'XDG_SESSION_PATH=/org/freedesktop/DisplayManager/Session0' 'JPK_OCC=/usr/include/oce' 'XDG_SEAT_PATH=/org/freedesktop/DisplayManager/Seat0' 'HYDRA_HOST_FILE=/mpi4/hosts' 'SSH_AUTH_SOCK=/tmp/ssh-NnhxNTH12143/agent.12143' 'SCOTCHDIR=/mpi3/C3/scotch_6.0.0' 'HYDRA_LAUNCHER=rsh' 'JPK_VER_B=B3' 'SESSION_MANAGER=local/mpi1:@/tmp/.ICE-unix/4284,unix/mpi1:/tmp/.ICE-unix/4284' 'DEFAULTS_PATH=/usr/share/gconf/ubuntu.default.path' 'ELMER_HOME=/mpi3/C3/elmer_6283' 'BLACS=/mpi3/C3/scalapack-2.0.2' 'BLAS32=/mpi3/C3/acml5.3.1/gfortran64_mp' 'METIS_DIR=' 'MPI_LIBS=-L/mpi3/C3/mpich-3.0.4/lib -lmpich -lmpichf90 -lmpl -lopa -lmpichcxx' 'XDG_CONFIG_DIRS=/etc/xdg/xdg-ubuntu:/etc/xdg' 'JPK_MPI_DIR=/mpi3' 'JPK_HDF5_S=/mpi3/S3/hdf5-1.8.10-patch1' 'MPIEXEC_PORT_RANGE=7000:7500' 'PATH=/mpi3/C3/cmake-2.8.10.2/bin:/mpi3/C3/blcr-0.8.5/bin:/mpi3/C3/mpich-3.0.4/bin:/usr/lib/lightdm/lightdm:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/mpi3/C3/elmer_6283/bin:/mpi3/C3/elmer_6283/lib:/mpi3/C3/ParaView3' 'DESKTOP_SESSION=ubuntu' 'BLAS=/mpi3/C3/acml5.3.1/gfortran64_mp' 'METIS_LIBDIR=/mpi3/C3/ParMetis-3.2.0' 'CMAKE_COMMAND=/mpi3/C3/cmake-2.8.10.2/bin' 'QT_QMAKE_EXECUTABLE=/usr/bin/qmake-qt4' 'LC_IDENTIFICATION=fi_FI.UTF-8' 'JPK_SCOTCHDIR_S=/mpi3/S3/scotch_6.0.0_esmumps' 'JPK_LMETISDIR_S=/mpi3/S3/ParMetis-3.2.0' 'PWD=/mpi3/S3/mpich-3.0.4' 'NETGENDIR=/mpi3/C3/netgen_668/bin' 'EDITOR=nano' 'JPK_LMETISDIR=/mpi3/C3/ParMetis-3.2.0' 'GNOME_KEYRING_PID=4273' 'LANG=fi_FI.UTF-8' 'MANDATORY_PATH=/usr/share/gconf/ubuntu.mandatory.path' 'OMPI_77=//mpi3/C3/mpich-3.0.4/bin/mpif77' 'LC_MEASUREMENT=fi_FI.UTF-8' 'JPK_HDF5=/mpi3/C3/hdf5-1.8.10-patch1' 'UBUNTU_MENUPROXY=libappmenu.so' 'COMPIZ_CONFIG_PROFILE=ubuntu' 'ELMER_POST_HOME=/mpi3/C3/elmer_6283/bin' 'JPK_INS=/mpi3/C3' 'ELMER_LIB=/mpi3/C3/elmer_6283/share/elmersolver/lib' 'HYDRA_PROXY_RETRY_COUNT=3' 'GDMSESSION=ubuntu' 'JPK_ELMER_S=/mpi3/S3/elmer_6283' 'JPK_LMETISDIR5=/mpi3/C3/parmetis-4.0.2' 'JPK_LMETISDIR32=/mpi3/C3/ParMetis-3.2.0' 
'HYDRA_DEBUG=1' 'JPK_BUI=/mpi3/S3' 'VTK_INCLUDEPATH=/mpi3/C3/VTK-5.8.0/include' 'SHLVL=1' 'HOME=/home/joni' 'OMPI_CC=//mpi3/C3/mpich-3.0.4/bin/mpicc' 'LANGUAGE=fi:en' 'OMPI_90=//mpi3/C3/mpich-3.0.4/bin/mpif90' 'ELMERGUI_HOME=/mpi3/C3/elmer_6283/bin' 'GNOME_DESKTOP_SESSION_ID=this-is-deprecated' 'MPI_IMPLEMENTATION=mpich' 'MKL_SERIAL=YES' 'LOGNAME=joni' 'HYPRE=/mpi3/C3/hypre-2.8.0b' 'JPK_ARPACK_S=/mpi3/S3/ARPACK' 'JPK_JOBS=7' 'JPK_VTK_DIR=/mpi3/C3/VTK-5.8.0' 'SCALAPACK=/mpi3/C3/scalapack-2.0.2' 'XDG_DATA_DIRS=/usr/share/ubuntu:/usr/share/gnome:/usr/local/share/:/usr/share/' 'DBUS_SESSION_BUS_ADDRESS=unix:abstract=/tmp/dbus-gAYmjGaklf,guid=26c3f15a7a5ee55e8782415700000034' 'JPK_ARPACK=/mpi3/C3/ARPACK' 'MPI_HOME=/mpi3/C3/mpich-3.0.4' 'LESSOPEN=| /usr/bin/lesspipe %s' 'LACPACK=/mpi3/C3/acml5.3.1/gfortran64_mp' 'OMPI_CXX=//mpi3/C3/mpich-3.0.4/bin/mpicxx' 'OMP_NUM_THREADS=6' 'JPK_TOGL_S=/mpi3/S3/Togl-1.7' 'HYDRA_LAUNCHER_EXEC=/usr/bin/rsh-redone-rsh' 'JPK_MPICH2=/mpi3/C3/mpich-3.0.4' 'PARDISO=/mpi3/C3/pardiso' 'PARDISOLICMESSAGE=1' 'JPK_VER=C3' 'XDG_CURRENT_DESKTOP=Unity' 'LESSCLOSE=/usr/bin/lesspipe %s %s' 'LC_TIME=fi_FI.UTF-8' 'JPK_HYPRE_S=/mpi3/S3/hypre-2.8.0b' 'JPK_MPICH2_S=/mpi3/S3/mpich-3.0.4' 'COLORTERM=gnome-terminal' 'XAUTHORITY=/home/joni/.Xauthority' 'LC_NAME=fi_FI.UTF-8' '_=/mpi3/C3/mpich-3.0.4/bin/mpiexec' 'OLDPWD=/home/joni' --global-user-env 0 --global-system-env 2 'MPICH_ENABLE_CKPOINT=1' 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --genv-prop all --proxy-core-count 2 --exec --exec-appnum 0 --exec-proc-count 2 --exec-local-env 0 --exec-wdir /mpi3/S3/mpich-3.0.4 --exec-args 1 hostname <br>
<div class="im">
<br>[mpiexec@mpi1] Launch arguments: /mpi3/C3/mpich-3.0.4/bin/hydra_pmi_proxy --control-port <a href="http://192.168.0.41:7001" target="_blank">192.168.0.41:7001</a> --debug --rmk user --launcher rsh --launcher-exec /usr/bin/rsh-redone-rsh --demux poll --pgid 0 --retries 3 --usize -2 --proxy-id 0 <br>

[mpiexec@mpi1] Launch arguments: /usr/bin/rsh-redone-rsh 192.168.0.42 "/mpi3/C3/mpich-3.0.4/bin/hydra_pmi_proxy" --control-port <a href="http://192.168.0.41:7001" target="_blank">192.168.0.41:7001</a> --debug --rmk user --launcher rsh --launcher-exec /usr/bin/rsh-redone-rsh --demux poll --pgid 0 --retries 3 --usize -2 --proxy-id 1 <br>
</div>
[mpiexec@mpi1] Launch arguments: /usr/bin/rsh-redone-rsh 192.168.0.43 "/mpi3/C3/mpich-3.0.4/bin/hydra_pmi_proxy" --control-port <a href="http://192.168.0.41:7001" target="_blank">192.168.0.41:7001</a> --debug --rmk user --launcher rsh --launcher-exec /usr/bin/rsh-redone-rsh --demux poll --pgid 0 --retries 3 --usize -2 --proxy-id 2 <br>

mpi1<br>m<br></div></div><div class="gmail_extra"><div><div class="h5"><br><br><div class="gmail_quote">2013/8/27 Pavan Balaji <span dir="ltr"><<a href="mailto:balaji@mcs.anl.gov" target="_blank">balaji@mcs.anl.gov</a>></span><br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
<br>
Please don't drop <a href="mailto:discuss@mpich.org" target="_blank">discuss@mpich.org</a> from the cc list.<br>
<br>
I doubt demux, --assert-level, and blcr are relevant here.  Also, the output of "make testing" is not helpful for us, because those tests can fail if your machines are just too slow.<br>
<br>
Did you try my suggestions from the previous email?  Could you try them and report back (with just that information)?<br>
<br>
 -- Pavan<div><br>
<br>
On 08/27/2013 09:35 AM, Joni-Pekka Kurronen wrote:<br>
</div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div>
<br>
hi,<br>
<br>
I have already checked this.<br>
This is not a new install; I just moved up from MPICH2 to MPICH 3 and<br>
from gcc 4.6 to gcc 4.7.<br>
<br>
I have rsh-redone-rsh as the main launcher but have tested ssh as well.<br>
After a crash, client processes keep running on the slaves and using the hard disk.<br>
<br>
The following work:<br>
any machine alone,<br>
mpi1 and kaak, or mpi1 and ugh,<br>
but everything together fails, except for hostname.<br>
<br>
This could be related to:<br>
- demux (I have tried select and poll; with poll I have to restart the slave<br>
machines)<br>
- nfs4 (for some reason NFS4 currently has to be mounted manually on the<br>
slaves after a restart)<br>
- I have changed --assert-level to 0 (default is 2)<br>
- blcr<br>
<br>
<br>
ch3:sock setting:<br>
<br>
=============<br>
hosts file,..<br>
</div><a href="http://192.168.0.41:2" target="_blank">192.168.0.41:2</a> <<a href="http://192.168.0.41:2" target="_blank">http://192.168.0.41:2</a>><br>
<a href="http://192.168.0.42:2" target="_blank">192.168.0.42:2</a> <<a href="http://192.168.0.42:2" target="_blank">http://192.168.0.42:2</a>><br>
#<a href="http://192.168.0.43:2" target="_blank">192.168.0.43:2</a> <<a href="http://192.168.0.43:2" target="_blank">http://192.168.0.43:2</a>><div><div><br>
=============<br>
summary.xml errors<br>
<MPITEST><br>
<NAME>spawninfo1</NAME><br>
<NP>1</NP><br>
<WORKDIR>./spawn</WORKDIR><br>
<STATUS>fail</STATUS><br>
<TESTDIFF><br>
[mpiexec@mpi1] APPLICATION TIMED OUT<br>
[proxy:0:0@mpi1] HYD_pmcd_pmip_control_cmd_cb<br>
(./pm/pmiserv/pmip_cb.c:886): assert (!closed) failed<br>
[proxy:0:0@mpi1] HYDT_dmxu_poll_wait_for_event<br>
(./tools/demux/demux_poll.c:<u></u>77): callback returned error status<br>
[proxy:0:0@mpi1] main (./pm/pmiserv/pmip.c:206): demux engine error<br>
waiting for event<br>
[proxy:1:0@mpi1] HYD_pmcd_pmip_control_cmd_cb<br>
(./pm/pmiserv/pmip_cb.c:886): assert (!closed) failed<br>
[proxy:1:0@mpi1] HYDT_dmxu_poll_wait_for_event<br>
(./tools/demux/demux_poll.c:<u></u>77): callback returned error status<br>
[proxy:1:0@mpi1] main (./pm/pmiserv/pmip.c:206): demux engine error<br>
waiting for event<br>
[mpiexec@mpi1] HYDT_bscu_wait_for_completion<br>
(./tools/bootstrap/utils/bscu_<u></u>wait.c:76): one of the processes<br>
terminated badly; aborting<br>
[mpiexec@mpi1] HYDT_bsci_wait_for_completion<br>
(./tools/bootstrap/src/bsci_<u></u>wait.c:23): launcher returned error waiting<br>
for completion<br>
[mpiexec@mpi1] HYD_pmci_wait_for_completion<br>
(./pm/pmiserv/pmiserv_pmci.c:<u></u>188): launcher returned error waiting for<br>
completion<br>
[mpiexec@mpi1] main (./ui/mpich/mpiexec.c:331): process manager error<br>
waiting for completion<br>
</TESTDIFF><br>
</MPITEST><br>
<MPITEST><br>
<NAME>rdwrord</NAME><br>
<NP>4</NP><br>
<WORKDIR>./io</WORKDIR><br>
<STATUS>fail</STATUS><br>
<TESTDIFF><br>
Fatal error in PMPI_Bcast: Other MPI error<br>
[cli_2]: aborting job:<br>
Fatal error in PMPI_Bcast: Other MPI error<br>
Fatal error in PMPI_Bcast: Other MPI error<br>
[cli_0]: aborting job:<br>
Fatal error in PMPI_Bcast: Other MPI error<br>
<br>
==============================<u></u>==============================<u></u>=======================<br>
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES<br>
=   EXIT CODE: 1<br>
=   CLEANING UP REMAINING PROCESSES<br>
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES<br>
==============================<u></u>==============================<u></u>=======================<br>
</TESTDIFF><br>
</MPITEST><br>
<MPITEST><br>
<NAME>rdwrzero</NAME><br>
<NP>4</NP><br>
<WORKDIR>./io</WORKDIR><br>
<STATUS>fail</STATUS><br>
<TESTDIFF><br>
Fatal error in PMPI_Bcast: Other MPI error<br>
[cli_2]: aborting job:<br>
Fatal error in PMPI_Bcast: Other MPI error<br>
Fatal error in PMPI_Bcast: Other MPI error<br>
[cli_0]: aborting job:<br>
Fatal error in PMPI_Bcast: Other MPI error<br>
<br>
==============================<u></u>==============================<u></u>=======================<br>
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES<br>
=   EXIT CODE: 1<br>
=   CLEANING UP REMAINING PROCESSES<br>
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES<br>
==============================<u></u>==============================<u></u>=======================<br>
</TESTDIFF><br>
</MPITEST><br>
<MPITEST><br>
<NAME>getextent</NAME><br>
<NP>2</NP><br>
<WORKDIR>./io</WORKDIR><br>
<STATUS>pass</STATUS><br>
</MPITEST><br>
<MPITEST><br>
<NAME>setinfo</NAME><br>
<NP>4</NP><br>
<WORKDIR>./io</WORKDIR><br>
<STATUS>fail</STATUS><br>
<TESTDIFF><br>
Fatal error in PMPI_Bcast: Other MPI error<br>
[cli_2]: aborting job:<br>
Fatal error in PMPI_Bcast: Other MPI error<br>
Fatal error in PMPI_Bcast: Other MPI error<br>
[cli_0]: aborting job:<br>
Fatal error in PMPI_Bcast: Other MPI error<br>
<br>
==============================<u></u>==============================<u></u>=======================<br>
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES<br>
=   EXIT CODE: 1<br>
=   CLEANING UP REMAINING PROCESSES<br>
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES<br>
==============================<u></u>==============================<u></u>=======================<br>
</TESTDIFF><br>
</MPITEST><br>
<MPITEST><br>
<NAME>setviewcur</NAME><br>
<NP>4</NP><br>
<WORKDIR>./io</WORKDIR><br>
<STATUS>fail</STATUS><br>
<TESTDIFF><br>
Fatal error in PMPI_Bcast: Other MPI error<br>
[cli_2]: aborting job:<br>
Fatal error in PMPI_Bcast: Other MPI error<br>
Fatal error in PMPI_Bcast: Other MPI error<br>
[cli_0]: aborting job:<br>
Fatal error in PMPI_Bcast: Other MPI error<br>
<br>
==============================<u></u>==============================<u></u>=======================<br>
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES<br>
=   EXIT CODE: 1<br>
=   CLEANING UP REMAINING PROCESSES<br>
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES<br>
==============================<u></u>==============================<u></u>=======================<br>
</TESTDIFF><br>
</MPITEST><br>
<br>
....<br>
....<br>
....<br>
<br>
<br>
============<br>
============<br>
hosts file,..<br>
</div></div><a href="http://192.168.0.41:2" target="_blank">192.168.0.41:2</a> <<a href="http://192.168.0.41:2" target="_blank">http://192.168.0.41:2</a>><br>
<a href="http://192.168.0.42:2" target="_blank">192.168.0.42:2</a> <<a href="http://192.168.0.42:2" target="_blank">http://192.168.0.42:2</a>><br>
<a href="http://192.168.0.43:2" target="_blank">192.168.0.43:2</a> <<a href="http://192.168.0.43:2" target="_blank">http://192.168.0.43:2</a>><div><div><br>
============<br>
============<br>
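For reference, Hydra host files use one host:slots entry per line (a bare hostname counts as one slot, and '#' lines are comments). A minimal shell sketch, using a temporary copy of the three-machine file above, confirms the total slot count mpiexec will see:

```shell
# Sketch: parse a Hydra-style host file and total the available slots.
# The entries mirror the three-machine host file from this thread.
hostfile=$(mktemp)
cat > "$hostfile" <<'EOF'
192.168.0.41:2
192.168.0.42:2
192.168.0.43:2
EOF
total=0
while IFS=: read -r host slots; do
  case "$host" in ''|'#'*) continue ;; esac  # skip blank and comment lines
  slots=${slots:-1}                          # a bare hostname means 1 slot
  total=$((total + slots))
done < "$hostfile"
echo "total slots: $total"   # prints "total slots: 6"
rm -f "$hostfile"
```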
<br>
When more than 4 processes are needed, the run hangs:<br>
<br>
Unexpected output in allred3: [mpiexec@mpi1] APPLICATION TIMED OUT<br>
Unexpected output in allred3: [proxy:0:0@mpi1]<br>
HYD_pmcd_pmip_control_cmd_cb (./pm/pmiserv/pmip_cb.c:886): assert<br>
(!closed) failed<br>
Unexpected output in allred3: [proxy:0:0@mpi1]<br>
HYDT_dmxu_poll_wait_for_event (./tools/demux/demux_poll.c:<u></u>77): callback<br>
returned error status<br>
Unexpected output in allred3: [proxy:0:0@mpi1] main<br>
(./pm/pmiserv/pmip.c:206): demux engine error waiting for event<br>
<br>
======<br>
<br>
When using ch3:nemesis, the hard disk is not running all the time as it is<br>
with ch3:sock.<br>
<br>
<br>
<br>
==============================<u></u>==================<br>
This is sourced by rsh-redone-rsh for every process:<br>
==============================<u></u>==================<br>
#!/bin/bash<br>
<br>
# JPK-Integration for Ubuntu 12.04 LTS<br>
#<br>
<a href="https://sites.google.com/site/jpsdatareviewstheboy007/ubuntu-lts-12-4-companion-whit-ltsp-mpich2-elmer-openfoam" target="_blank">https://sites.google.com/site/<u></u>jpsdatareviewstheboy007/<u></u>ubuntu-lts-12-4-companion-<u></u>whit-ltsp-mpich2-elmer-<u></u>openfoam</a><br>


#<br>
# The CMake build loops and cannot complete; the documentation says the<br>
CMake-based build is still under development<br>
<br>
# gcc 4.7<br>
# bdver1 optimization<br>
<br>
shopt -s expand_aliases<br>
export JPK_MPI_DIR=/mpi3         # MAIN DIRECTORY, SUBDIRECTORIES:<br>
export JPK_VER=C3                # BINARY CODE<br>
export JPK_VER_S=S3              # SOURCE CODE<br>
export JPK_VER_B=B3              # BASH FILES TO COMPILE AND CONFIGURE<br>
export JPK_INS=$JPK_MPI_DIR/$JPK_VER<br>
export JPK_BUI=$JPK_MPI_DIR/$JPK_VER_<u></u>S<br>
export JPK_ELMER=$JPK_INS/elmer_6283 #035<br>
export JPK_ELMER_S=$JPK_BUI/elmer_<u></u>6283<br>
export JPK_NETGEN_S=$JPK_BUI/netgen_<u></u>668<br>
export JPK_NETGEN=$JPK_INS/netgen_668<br>
<br>
#GCC<br>
#export JPK_FLAGS="-Wl,--no-as-needed -fPIC -DAdd_ -m64 -pthread -O3<br>
-fopenmp -lgomp -march=bdver1 -ftree-vectorize -funroll-loops"<br>
#export CFLAGS="-Wl,--no-as-needed -fPIC -DAdd_ -m64 -pthread -fopenmp<br>
-lgomp"<br>
<br>
# M A K E<br>
<br>
export JPK_JOBS=7<br>
<br>
# O P E N  MP<br>
export OMP_NUM_THREADS=6<br>
<br>
<br>
# M P I C 3<br>
# <a href="http://wiki.mcs.anl.gov/mpich2/index.php/Using_the_Hydra_Process_Manager" target="_blank">http://wiki.mcs.anl.gov/<u></u>mpich2/index.php/Using_the_<u></u>Hydra_Process_Manager</a><br>
export JPK_MPICH2_S=$JPK_BUI/mpich-3.<u></u>0.4<br>
export JPK_MPICH2=$JPK_INS/mpich-3.0.<u></u>4<br>
export PATH=$JPK_MPICH2/bin:$PATH<br>
export MPI_HOME=$JPK_MPICH2<br>
export MPI_LIBS="-L$JPK_MPICH2/lib -lmpich -lmpichf90 -lmpl -lopa<br>
-lmpichcxx"<br>
export LD_LIBRARY_PATH=$JPK_MPICH2/<u></u>lib:$JPK_MPICH2/bin # FIRST<br>
<br>
# M P I<br>
<br>
export MPI_IMPLEMENTATION=mpich<br>
<br>
export OMPI_CC=/$JPK_MPI_DIR/$JPK_<u></u>VER/mpich-3.0.4/bin/mpicc<br>
export OMPI_CXX=/$JPK_MPI_DIR/$JPK_<u></u>VER/mpich-3.0.4/bin/mpicxx<br>
export OMPI_77=/$JPK_MPI_DIR/$JPK_<u></u>VER/mpich-3.0.4/bin/mpif77<br>
export OMPI_90=/$JPK_MPI_DIR/$JPK_<u></u>VER/mpich-3.0.4/bin/mpif90<br>
<br>
# <a href="http://wiki.mcs.anl.gov/mpich2/index.php/Using_the_Hydra_Process_Manager" target="_blank">http://wiki.mcs.anl.gov/<u></u>mpich2/index.php/Using_the_<u></u>Hydra_Process_Manager</a><br>
export HYDRA_DEBUG=0<br>
export HYDRA_HOST_FILE=/mpi4/hosts<br>
export HYDRA_LAUNCHER=rsh<br>
#export HYDRA_LAUNCHER=ssh<br>
#export HYDRA_LAUNCHER_EXEC=/usr/bin/<u></u>netkit-rsh<br>
export HYDRA_LAUNCHER_EXEC=/usr/bin/<u></u>rsh-redone-rsh<br>
#export HYDRA_LAUNCHER_EXEC=/usr/bin/<u></u>ssh<br>
export HYDRA_DEMUX=select<br>
#export HYDRA_DEMUX=select # more processes than cores<br>
export HYDRA_PROXY_RETRY_COUNT=3<br>
#export HYDRA_RMK=pbs<br>
#export HYDRA_DEFAULT_RMK=pbs<br>
export HYDRA_ENV=all<br>
export MPIEXEC_PORT_RANGE=7000:7500<br>
#mpirun -launcher rsh -launcher-exec /usr/bin/netkit-rsh -demux select<br>
-n 21 ddd ./cpi<br>
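As a sanity check on the settings above, Hydra's mpiexec can report which launchers and demux engines it was actually built with (via its -info flag); a hedged sketch, with a guard so the snippet is harmless on machines where mpiexec is not installed:

```shell
# Show Hydra's build-time launcher/demux support, if mpiexec is present.
if command -v mpiexec >/dev/null 2>&1; then
  mpiexec -info | grep -iE 'launcher|demux' || true
else
  echo "mpiexec not found on PATH"
fi
```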
<br>
# b l c r<br>
<br>
export HYDRA_CKPOINTLIB=blcr<br>
export HYDRA_CKPOINT_PREFIX=/mpi3/<u></u>chekpoint/default.chk<br>
export HYDRA_CKPOINT_INTERVAL=10800<br>
export PATH=$JPK_INS/blcr-0.8.5/bin:$<u></u>PATH<br>
export LD_LIBRARY_PATH=$LD_LIBRARY_<u></u>PATH:$JPK_INS/blcr-0.8.5/lib<br>
#-ckpoint-num 5<br>
<br>
# C M A K E - BUILD<br>
export PATH=$JPK_INS/cmake-2.8.10.2/<u></u>bin:$PATH<br>
export CMAKE_COMMAND=$JPK_INS/cmake-<u></u><a href="http://2.8.10.2/bin" target="_blank">2.8.10.2/bin</a><br>
<br>
# T O G L - UI netgen<br>
<br>
export JPK_TOGL="$JPK_INS/Togl-1.7"<br>
export JPK_TOGL_S="$JPK_BUI/Togl-1.7"<br>
<br>
# OCC<br>
export JPK_OCC=/usr/include/oce<br>
<br>
# M A T H<br>
<br>
export JPK_ARPACK_S=$JPK_BUI/ARPACK<br>
export JPK_ARPACK=$JPK_INS/ARPACK<br>
<br>
export BLAS=$JPK_INS/acml5.3.1/<u></u>gfortran64_mp<br>
export BLAS32=$JPK_INS/acml5.3.1/<u></u>gfortran64_mp<br>
#export BLAS=$JPK_INS/clAmdBlas-1.10.<u></u>321/lib64<br>
#export BLAS32=$JPK_INS/clAmdBlas-1.<u></u>10.321/include<br>
export FFT=$JPK_INS/fftw2<br>
export LAPACK=$BLAS<br>
export SCALAPACK=$JPK_INS/scalapack-<u></u>2.0.2<br>
export BLACS=$SCALAPACK<br>
<br>
export JPK_LMETISDIR_S=$JPK_BUI/<u></u>ParMetis-3.2.0<br>
export JPK_LMETISDIR=$JPK_INS/<u></u>ParMetis-3.2.0<br>
export JPK_LMETISDIR32=$JPK_LMETISDIR<br>
export JPK_LMETISDIR_S5=$JPK_BUI/<u></u>parmetis-4.0.2<br>
export JPK_LMETISDIR5=$JPK_INS/<u></u>parmetis-4.0.2<br>
<br>
export METIS_DIR="" #$JPK_LMETISDIR MUST BE EMPTY<br>
export METIS_INCLUDE_DIR=$JPK_<u></u>LMETISDIR<br>
export METIS_LIBDIR=$JPK_LMETISDIR<br>
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$JPK_LMETISDIR:$BLAS/lib:$FFT/lib<br>
#/mpi4/S/metis-5.0.2/GKlib<br>
<br>
export SCOTCHDIR=$JPK_INS/scotch_6.0.<u></u>0<br>
export JPK_SCOTCHDIR_S=$JPK_BUI/<u></u>scotch_6.0.0_esmumps<br>
<br>
export MUMPS_I=$JPK_INS/MUMPS_4.10.0<br>
export MUMPS=$JPK_BUI/MUMPS_4.10.0<br>
<br>
export HYPRE=$JPK_INS/hypre-2.8.0b<br>
export LD_LIBRARY_PATH=$LD_LIBRARY_<u></u>PATH:$HYPRE/lib<br>
export JPK_HYPRE_S=$JPK_BUI/hypre-2.<u></u>8.0b<br>
<br>
export PARDISOLICMESSAGE=1<br>
export PARDISO=$JPK_INS/pardiso<br>
export PARDISO_LIC_PATH=$PARDISO<br>
export MKL_SERIAL=YES<br>
<br>
export<br>
LD_LIBRARY_PATH=$LD_LIBRARY_<u></u>PATH:$SCOTCHDIR/lib:$MUMPS/<u></u>lib:$BLAS/lib:$SCALAPACK/lib:$<u></u>HYPRE/lib:$PARDISO:$METIS_<u></u>LIBDIR:$JPK_ARPACK<br>
<br>
#HDF5<br>
#export JPK_HDF5_S=$JPK_BUI/hdf5-1.8.<u></u>10-patch1 for vtk testing<br>
#export JPK_HDF5=$JPK_INS/hdf5-1.8.10-<u></u>patch1<br>
export JPK_HDF5_S=$JPK_BUI/hdf5-1.8.<u></u>10-patch1<br>
export JPK_HDF5=$JPK_INS/hdf5-1.8.10-<u></u>patch1<br>
export LD_LIBRARY_PATH=$LD_LIBRARY_<u></u>PATH:$JPK_HDF5/lib<br>
<br>
# V T K<br>
export JPK_VTK_DIR=$JPK_INS/VTK-5.8.0<br>
export LD_LIBRARY_PATH=$LD_LIBRARY_<u></u>PATH:$JPK_VTK_DIR/lib/vtk-5.8<br>
export VTK_INCLUDEPATH=$JPK_VTK_DIR/<u></u>include<br>
<br>
# Q T<br>
export QT_QMAKE_EXECUTABLE=/usr/bin/<u></u>qmake-qt4<br>
<br>
# O P E N    F O A M<br>
# <a href="http://www.openfoam.org/download/source.php" target="_blank">http://www.openfoam.org/<u></u>download/source.php</a><br>
<br>
#export WM_SCHEDULER=wmakeScheduler<br></div></div>
#export WM_HOSTS="<a href="http://192.168.0.41:6" target="_blank">192.168.0.41:6</a> <<a href="http://192.168.0.41:6" target="_blank">http://192.168.0.41:6</a>> <a href="http://192.168.0.42:6" target="_blank">192.168.0.42:6</a><br>


<<a href="http://192.168.0.42:6" target="_blank">http://192.168.0.42:6</a>> <a href="http://192.168.0.43:6" target="_blank">192.168.0.43:6</a> <<a href="http://192.168.0.43:6" target="_blank">http://192.168.0.43:6</a>>"<div>

<div><br>
#export WM_NCOMPPROCS=$($WM_SCHEDULER -count)<br>
#export WM_COLOURS="black blue green cyan red magenta yellow"<br>
<br>
#export FOAM_INST_DIR=/mpi2/OpenFOAM<br>
#foamDotFile=$FOAM_INST_DIR/<u></u>OpenFOAM-2.1.x/etc/bashrc<br>
#[ -f $foamDotFile ] && . $foamDotFile<br>
#source /mpi3/OpenFOAM/OpenFOAM-2.1.x/<u></u>etc/bashrc<br>
<br>
#export FOAM_RUN=/mpi2/om<br>
#export OpenBIN=/mpi2/OpenFOAM/<u></u>OpenFOAM-2.1.x/bin/tools<br>
#export PATH=$OpenBIN:$PATH<br>
#export LD_LIBRARY_PATH=$LD_LIBRARY_<u></u>PATH:/apps/OpenFOAM/tools/<u></u>lib64<br>
<br>
#export<br>
ParaView_DIR=/mpi2/OpenFOAM/<u></u>ThirdParty-2.1.x/platforms/<u></u>linux64Gcc/paraview-3.12.0<br>
#export PATH=$ParaView_DIR/bin:$PATH<br>
#export PV_PLUGIN_PATH=$FOAM_LIBBIN/<u></u>paraview-3.12<br>
<br>
# E L M E R<br>
export ELMER_HOME=$JPK_ELMER<br>
export ELMER_LIB=$JPK_ELMER/share/<u></u>elmersolver/lib<br>
export PATH=$PATH:$ELMER_HOME/bin:$<u></u>ELMER_HOME/lib<br>
export LD_LIBRARY_PATH=$LD_LIBRARY_<u></u>PATH:$ELMER_HOME/lib<br>
export ELMERGUI_HOME=$ELMER_HOME/bin<br>
export ELMER_POST_HOME=$ELMER_HOME/<u></u>bin<br>
<br>
# S a l o m é<br>
#cd /mpi2/salome-meca/SALOME-MECA-<u></u>2012.2-LGPL ; source envSalomeMeca.sh<br>
#cd ~/<br>
<br>
# Paraview<br>
#export PATH=$PATH:$JPK_INS/ParaView-<u></u>3.14.1-Linux-64bit<br>
export PATH=$PATH:$JPK_INS/ParaView3<br>
<br>
# N E T G E N   P A R A L L E L $JPK_TCL/lib:$JPK_TK/lib:<br>
#export<br>
LD_LIBRARY_PATH=$LD_LIBRARY_<u></u>PATH:$JPK_TOGL:$JPK_NETGEN\<u></u>par/lib:/usr/lib/<br>
#export NETGENDIR=$JPK_NETGEN\par/bin<br>
# NETGEN<br>
export LD_LIBRARY_PATH=$LD_LIBRARY_<u></u>PATH:$JPK_TOGL:$JPK_NETGEN/<u></u>lib:/usr/lib/<br>
export NETGENDIR=$JPK_NETGEN/bin<br>
<br>
#crontab, ext editor<br>
export EDITOR=nano<br>
<br>
#space ball<br>
export LD_LIBRARY_PATH=$LD_LIBRARY_<u></u>PATH:/usr/local/lib<br>
<br>
#vrpn & hidapi<br>
export<br>
LD_LIBRARY_PATH=$LD_LIBRARY_<u></u>PATH:/$JPK_MPI_DIR/$JPK_VER/<u></u>vrpn/lib:/$JPK_MPI_DIR/$JPK_<u></u>VER/hidapi/lib:/usr/include/<u></u>libusb-1.0<br>
<br>
<br>
<br>
<br>
<br></div></div>
2013/8/27 Pavan Balaji <<a href="mailto:balaji@mcs.anl.gov" target="_blank">balaji@mcs.anl.gov</a> <mailto:<a href="mailto:balaji@mcs.anl.gov" target="_blank">balaji@mcs.anl.gov</a>>><div><div><br>

<br>
<br>
    This is almost certainly a network issue with your third machine<br>
    (kaak, I presume?).<br>
<br>
    Thanks for making sure "hostname" works fine on all machines.  That<br>
    means that your ssh connections are setup correctly.  But a non-MPI<br>
    program, such as hostname, does not check the connection from kaak<br>
    back to mpi1.<br>
<br>
    Can you try a simple program like "examples/cpi" in the build<br>
    directory on all machines?  Try it on 2 machines (mpiexec -np 4) and<br>
    3 machines (mpiexec -np 6).<br>
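    The two runs above can be scripted as follows (the build-tree path is taken from this thread; the guard makes the snippet a no-op on machines without the MPICH build):

```shell
# Run the cpi example over 2 machines (4 ranks) and 3 machines (6 ranks).
cd /mpi3/S3/mpich-3.0.4 2>/dev/null || true  # build tree path from this thread
if command -v mpiexec >/dev/null 2>&1 && [ -x examples/cpi ]; then
  mpiexec -np 4 ./examples/cpi   # mpi1 plus one slave
  mpiexec -np 6 ./examples/cpi   # all three machines
else
  echo "skipping: mpiexec or examples/cpi not available here"
fi
```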
<br>
    If the third machine is in fact having problems running the application:<br>
<br>
    1. Make sure there's no firewall on the third machine.<br>
<br>
    2. Make sure the /etc/hosts file is consistent on both machines<br>
    (mpi1 and kaak).<br>
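    Check 2 can be scripted; a minimal sketch, assuming passwordless ssh to the slave hostnames used in this thread (an unreachable host simply reports as differing):

```shell
# Compare the head node's /etc/hosts against each slave's copy.
for h in ugh kaak; do
  if ssh -o ConnectTimeout=5 "$h" cat /etc/hosts 2>/dev/null \
       | diff -q /etc/hosts - >/dev/null 2>&1; then
    echo "$h: /etc/hosts matches"
  else
    echo "$h: /etc/hosts differs or host unreachable"
  fi
done
```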
<br>
      -- Pavan<br>
<br>
<br>
    On 08/27/2013 06:46 AM, Joni-Pekka Kurronen wrote:<br>
<br>
<br>
        I have:<br>
        -Ubuntu 12.04<br>
        -rsh-redone-rsh<br>
        -three machines<br>
        -mpich3<br>
        -have tried export HYDRA_DEMUX=select / poll<br>
        -have tried ssh/rsh<br>
        -have added to LIBS: event_core event_pthreads<br>
<br>
        I can run the tests on one or two machines without error, but<br>
        when I take the third machine into the cluster the demux engine goes mad:<br>
           a connection hangs and nothing happens.<br>
<br>
<br>
        <MPITEST><br>
        <NAME>uoplong</NAME><br>
        <NP>11</NP><br>
        <WORKDIR>./coll</WORKDIR><br>
        <STATUS>fail</STATUS><br>
        <TESTDIFF><br>
        [mpiexec@mpi1] APPLICATION TIMED OUT<br>
        [proxy:0:0@mpi1] HYD_pmcd_pmip_control_cmd_cb<br>
        (./pm/pmiserv/pmip_cb.c:886): assert (!closed) failed<br>
        [proxy:0:0@mpi1] HYDT_dmxu_poll_wait_for_event<br></div></div>
        (./tools/demux/demux_poll.c:77): callback returned error status<div><br>
        [proxy:0:0@mpi1] main (./pm/pmiserv/pmip.c:206): demux engine error<br>
        waiting for event<br>
        [mpiexec@mpi1] HYDT_bscu_wait_for_completion<br></div>
        (./tools/bootstrap/utils/bscu_wait.c:76): one of the processes<div><br>
        terminated badly; aborting<br>
        [mpiexec@mpi1] HYDT_bsci_wait_for_completion<br></div>
        (./tools/bootstrap/src/bsci_wait.c:23): launcher returned<div><br>
        error waiting<br>
        for completion<br>
        [mpiexec@mpi1] HYD_pmci_wait_for_completion<br></div>
        (./pm/pmiserv/pmiserv_pmci.c:188): launcher returned error<div><br>
        waiting for<br>
        completion<br>
        [mpiexec@mpi1] main (./ui/mpich/mpiexec.c:331): process manager<br>
        error<br>
        waiting for completion<br>
        </TESTDIFF><br>
        </MPITEST><br>
<br>
        Also I can run<br>
        joni@mpi1:/mpi3/S3/hpcc-1.4.2$ mpiexec -np 6 hostname<br>
        mpi1<br>
        mpi1<br>
        ugh<br>
        ugh<br>
        kaak<br>
        kaak<br>
<br>
        but if I run<br>
        joni@mpi1:/mpi3/S3/hpcc-1.4.2$ mpiexec -np 6 ls<br>
        I get only one directory listing as output, and the<br>
        system freezes until I restart the slave machines!<br>
<br>
<br>
<br>
<br>
<br>
    --<br>
    Pavan Balaji<br>
    <a href="http://www.mcs.anl.gov/~balaji" target="_blank">http://www.mcs.anl.gov/~balaji</a><br>
<br>
<br>
<br>
<br>
--<br>
Joni-Pekka Kurronen<br>
</div><a href="mailto:Joni.Kurronen@gmail.com" target="_blank">Joni.Kurronen@gmail.com</a> <mailto:<a href="mailto:Joni.Kurronen@gmail.com" target="_blank">Joni.Kurronen@gmail.<u></u>com</a>><div><br>
gsm. <a href="tel:%2B358%2050%20521%202279" value="+358505212279" target="_blank">+358 50 521 2279</a><br>
</div></blockquote>
<br><div><div>
-- <br>
Pavan Balaji<br>
<a href="http://www.mcs.anl.gov/~balaji" target="_blank">http://www.mcs.anl.gov/~balaji</a><br>
</div></div></blockquote></div><br><br clear="all"><br></div></div><span class="HOEnZb"><font color="#888888">-- <br>Joni-Pekka Kurronen</font></span><div class="im"><br><a href="mailto:Joni.Kurronen@gmail.com" target="_blank">Joni.Kurronen@gmail.com</a><br>
gsm. <a href="tel:%2B358%2050%20521%202279" value="+358505212279" target="_blank">+358 50 521 2279</a>
</div></div>
</blockquote></div><br><br clear="all"><br>-- <br>Joni-Pekka Kurronen<br><a href="mailto:Joni.Kurronen@gmail.com">Joni.Kurronen@gmail.com</a><br>gsm. +358 50 521 2279
</div>