[mpich-discuss] ./pm/pmiserv/pmip_cb.c:886): assert (!closed) failed

Joni-Pekka Kurronen joni.kurronen at gmail.com
Wed Aug 28 03:26:20 CDT 2013


hi,

I can sometimes get 6 processes running, and sometimes it crashes when 5
have started. Taking in a third machine, whichever one it is, starts the
problem; it seems to be a question of communication between the two
slaves, and it crashes when the two slaves start to work together
(mpirun -np 2 -hosts ugh,kaak ls).

File permissions should be OK, the ports are open, and I am using rsh.

Communication, the demux engine?
Could PAM control affect this?
Still wondering about NFS4?

joni

hosts
======
mpi1:2
ugh:2
kaak:2
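
Since the problem seems to be communication between the two slaves, one
thing worth checking is that rsh works in every direction between the
three hosts, not just from the master outward. A minimal sketch, assuming
the names mpi1, ugh and kaak resolve on every machine and passwordless
rsh-redone-rsh is set up everywhere:

#!/bin/bash
# Sketch: check rsh connectivity in all directions between the hosts.
# Assumes passwordless /usr/bin/rsh-redone-rsh from and to every machine.
HOSTS="mpi1 ugh kaak"
for src in $HOSTS; do
  for dst in $HOSTS; do
    [ "$src" = "$dst" ] && continue
    # Hop to $src, then from $src run hostname on $dst.
    /usr/bin/rsh-redone-rsh "$src" /usr/bin/rsh-redone-rsh "$dst" hostname \
      || echo "FAILED: $src -> $dst"
  done
done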

Debug output after the crash
======================
mpiexec -np 5 hostname
....
[mpiexec at mpi1] Launch arguments: /mpi3/C3/mpich-3.0.4/bin/hydra_pmi_proxy
--control-port 192.168.0.41:7000 --debug --rmk user --launcher rsh
--launcher-exec /usr/bin/rsh-redone-rsh --demux poll --pgid 0 --retries 2
--usize -2 --proxy-id 0
[mpiexec at mpi1] Launch arguments: /usr/bin/rsh-redone-rsh 192.168.0.42
"/mpi3/C3/mpich-3.0.4/bin/hydra_pmi_proxy" --control-port
192.168.0.41:7000 --debug --rmk user --launcher rsh --launcher-exec
/usr/bin/rsh-redone-rsh
--demux poll --pgid 0 --retries 2 --usize -2 --proxy-id 1
[mpiexec at mpi1] Launch arguments: /usr/bin/rsh-redone-rsh 192.168.0.43
"/mpi3/C3/mpich-3.0.4/bin/hydra_pmi_proxy" --control-port
192.168.0.41:7000 --debug --rmk user --launcher rsh --launcher-exec
/usr/bin/rsh-redone-rsh
--demux poll --pgid 0 --retries 2 --usize -2 --proxy-id 2
mpi1
mpi1
^X^C[mpiexec at mpi1] Sending Ctrl-C to processes as requested
[mpiexec at mpi1] Press Ctrl-C again to force abort
[mpiexec at mpi1] HYDU_sock_write (./utils/sock/sock.c:291): write error (Bad
file descriptor)
[mpiexec at mpi1] HYD_pmcd_pmiserv_send_signal
(./pm/pmiserv/pmiserv_cb.c:170): unable to write data to proxy
[mpiexec at mpi1] ui_cmd_cb (./pm/pmiserv/pmiserv_pmci.c:79): unable to send
signal downstream
[mpiexec at mpi1] HYDT_dmxu_poll_wait_for_event
(./tools/demux/demux_poll.c:77): callback returned error status
[mpiexec at mpi1] HYD_pmci_wait_for_completion
(./pm/pmiserv/pmiserv_pmci.c:197): error waiting for event
[mpiexec at mpi1] main (./ui/mpich/mpiexec.c:331): process manager error
waiting for completion
joni at mpi1:~$ echo $HYDRA_DEMUX
select
joni at mpi1:~$ sudo killall rsh-redone-rsh
[sudo] password for joni:
rsh-redone-rsh: no processes
joni at mpi1:~$
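
A plain "hostname" never opens MPI connections between the ranks, so a
small MPI program such as examples/cpi from the build tree (it does an
MPI_Bcast and an MPI_Reduce across all ranks) is a better probe. A minimal
sketch of a stepwise test, assuming the build tree is /mpi3/S3/mpich-3.0.4
as in the logs below:

# Step up the host set until the failure appears.
cd /mpi3/S3/mpich-3.0.4/examples
mpiexec -hosts mpi1,ugh -np 4 ./cpi       # master + first slave
mpiexec -hosts mpi1,kaak -np 4 ./cpi      # master + second slave
mpiexec -hosts ugh,kaak -np 4 ./cpi       # slave to slave only
mpiexec -hosts mpi1,ugh,kaak -np 6 ./cpi  # all three: the failing case

The pair that fails points at the link (or firewall / hosts entry) to
examine.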


BLCR "make check" output after recompiling
======================
root at mpi1:/mpi3/S3/blcr-0.8.5# master mpi1 errors:

34 child 21967 completed
#ST_ALARM:60
035 child 22072 is READY (context=SIGNAL stopped=YES)
036 child 22072 is STOPped
!!! Alarm clock expired
!!! Missing final DONE
!!! Test killed unexpectedly by signal 9
FAIL: stopped.st


root at ug:/mpi3/S3/blcr-0.8.5# slave ugh errors:


root at kaak:/mpi3/S3/blcr-0.8.5# slave kaak errors:


make check

34 child 7472 completed
#ST_ALARM:60
035 child 7577 is READY (context=SIGNAL stopped=YES)
036 child 7577 is STOPped
!!! Alarm clock expired
!!! Missing final DONE
!!! Test killed unexpectedly by signal 9
FAIL: stopped.st
PASS: edeadlk.st

/mpi3/S3/blcr-0.8.5/tests/.libs/lt-filedescriptors[8051]: file
"filedescriptors.c", line 270, in check_stat_simple: File attributes
changed.  1 mismatches
/mpi3/S3/blcr-0.8.5/tests/.libs/lt-filedescriptors[8051]: file "crut.c",
line 615, in crut_main: test_restart() unexpectedly returned -1
restart/nonzeroexit (255)
FAIL: filedescriptors.ct
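
Since BLCR checkpointing is wired into the environment (HYDRA_CKPOINTLIB
and friends, see the script quoted further down), one quick experiment is
to take BLCR out of the launch entirely and see whether the crash changes.
A sketch, assuming that unsetting these variables in the launching shell
is enough to disable the checkpoint setup:

unset HYDRA_CKPOINTLIB
unset HYDRA_CKPOINT_PREFIX
unset HYDRA_CKPOINT_INTERVAL
mpiexec -np 6 hostname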





2013/8/27 Joni-Pekka Kurronen <joni.kurronen at gmail.com>

>
>
> Another:
>
> joni at mpi1:/mpi3/S3/mpich-3.0.4$ mpiexec -np 6 hostname
> host: 192.168.0.41
> host: 192.168.0.42
> host: 192.168.0.43
>
>
> ==================================================================================================
> mpiexec options:
> ----------------
>   Base path: /mpi3/C3/mpich-3.0.4/bin/
>   Launcher: (null)
>   Debug level: 1
>   Enable X: -1
>
>   Global environment:
>   -------------------
>
>     MUMPS=/mpi3/S3/MUMPS_4.10.0
>     LC_PAPER=fi_FI.UTF-8
>     LC_ADDRESS=fi_FI.UTF-8
>     SSH_AGENT_PID=12144
>     LC_MONETARY=fi_FI.UTF-8
>     MUMPS_I=/mpi3/C3/MUMPS_4.10.0
>     HYDRA_DEMUX=select
>     GPG_AGENT_INFO=/tmp/keyring-kJwpJQ/gpg:0:1
>     JPK_LMETISDIR_S5=/mpi3/S3/parmetis-4.0.2
>     TERM=xterm
>     SHELL=/bin/bash
>
> XDG_SESSION_COOKIE=6d6390cb56a32b6678c10da600000412-1377606907.629665-1922379047
>     FFT=/mpi3/C3/fftw2
>     HYDRA_ENV=all
>     JPK_NETGEN=/mpi3/C3/netgen_668
>     JPK_VER_S=S3
>     HYDRA_CKPOINTLIB=blcr
>     HYDRA_CKPOINT_INTERVAL=10800
>     WINDOWID=54602522
>     LC_NUMERIC=fi_FI.UTF-8
>     HYDRA_CKPOINT_PREFIX=/mpi3/chekpoint/default.chk
>     GNOME_KEYRING_CONTROL=/tmp/keyring-kJwpJQ
>     JPK_ELMER=/mpi3/C3/elmer_6283
>     PARDISO_LIC_PATH=/mpi3/C3/pardiso
>     METIS_INCLUDE_DIR=/mpi3/C3/ParMetis-3.2.0
>     JPK_NETGEN_S=/mpi3/S3/netgen_668
>     USER=joni
>
> LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arj=01;31:*.taz=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lz=01;31:*.xz=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=01;35:*.anx=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.axa=00;36:*.oga=00;36:*.spx=00;36:*.xspf=00;36:
>     JPK_TOGL=/mpi3/C3/Togl-1.7
>
> LD_LIBRARY_PATH=/mpi3/C3/mpich-3.0.4/lib:/mpi3/C3/mpich-3.0.4/bin:/mpi3/C3/blcr-0.8.5/lib:/mpi3/C3/ParMetis-3.2.0:/mpi3/C3/acml5.3.1/gfortran64_mp/lib:/lib:/mpi3/C3/hypre-2.8.0b/lib:/mpi3/C3/scotch_6.0.0/lib:/mpi3/S3/MUMPS_4.10.0/lib:/mpi3/C3/acml5.3.1/gfortran64_mp/lib:/mpi3/C3/scalapack-2.0.2/lib:/mpi3/C3/hypre-2.8.0b/lib:/mpi3/C3/pardiso:/mpi3/C3/ParMetis-3.2.0:/mpi3/C3/ARPACK:/mpi3/C3/hdf5-1.8.10-patch1/lib:/mpi3/C3/VTK-5.8.0/lib/vtk-5.8:/mpi3/C3/elmer_6283/lib:/mpi3/C3/Togl-1.7:/mpi3/C3/netgen_668/lib:/usr/lib/:/usr/local/lib://mpi3/C3/vrpn/lib://mpi3/C3/hidapi/lib:/usr/include/libusb-1.0
>     LC_TELEPHONE=fi_FI.UTF-8
>     XDG_SESSION_PATH=/org/freedesktop/DisplayManager/Session0
>     JPK_OCC=/usr/include/oce
>     XDG_SEAT_PATH=/org/freedesktop/DisplayManager/Seat0
>     HYDRA_HOST_FILE=/mpi4/hosts
>     SSH_AUTH_SOCK=/tmp/ssh-NnhxNTH12143/agent.12143
>     SCOTCHDIR=/mpi3/C3/scotch_6.0.0
>     HYDRA_LAUNCHER=rsh
>     JPK_VER_B=B3
>     SESSION_MANAGER=local/mpi1:@
> /tmp/.ICE-unix/4284,unix/mpi1:/tmp/.ICE-unix/4284
>     DEFAULTS_PATH=/usr/share/gconf/ubuntu.default.path
>     ELMER_HOME=/mpi3/C3/elmer_6283
>     BLACS=/mpi3/C3/scalapack-2.0.2
>     BLAS32=/mpi3/C3/acml5.3.1/gfortran64_mp
>     METIS_DIR=
>     MPI_LIBS=-L/mpi3/C3/mpich-3.0.4/lib -lmpich -lmpichf90 -lmpl -lopa
> -lmpichcxx
>     XDG_CONFIG_DIRS=/etc/xdg/xdg-ubuntu:/etc/xdg
>     JPK_MPI_DIR=/mpi3
>     JPK_HDF5_S=/mpi3/S3/hdf5-1.8.10-patch1
>     MPIEXEC_PORT_RANGE=7000:7500
>
> PATH=/mpi3/C3/cmake-2.8.10.2/bin:/mpi3/C3/blcr-0.8.5/bin:/mpi3/C3/mpich-3.0.4/bin:/usr/lib/lightdm/lightdm:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/mpi3/C3/elmer_6283/bin:/mpi3/C3/elmer_6283/lib:/mpi3/C3/ParaView3
>     DESKTOP_SESSION=ubuntu
>     BLAS=/mpi3/C3/acml5.3.1/gfortran64_mp
>     METIS_LIBDIR=/mpi3/C3/ParMetis-3.2.0
>     CMAKE_COMMAND=/mpi3/C3/cmake-2.8.10.2/bin
>     QT_QMAKE_EXECUTABLE=/usr/bin/qmake-qt4
>     LC_IDENTIFICATION=fi_FI.UTF-8
>     JPK_SCOTCHDIR_S=/mpi3/S3/scotch_6.0.0_esmumps
>     JPK_LMETISDIR_S=/mpi3/S3/ParMetis-3.2.0
>     PWD=/mpi3/S3/mpich-3.0.4
>     NETGENDIR=/mpi3/C3/netgen_668/bin
>     EDITOR=nano
>     JPK_LMETISDIR=/mpi3/C3/ParMetis-3.2.0
>     GNOME_KEYRING_PID=4273
>     LANG=fi_FI.UTF-8
>     MANDATORY_PATH=/usr/share/gconf/ubuntu.mandatory.path
>     OMPI_77=//mpi3/C3/mpich-3.0.4/bin/mpif77
>     LC_MEASUREMENT=fi_FI.UTF-8
>     JPK_HDF5=/mpi3/C3/hdf5-1.8.10-patch1
>     UBUNTU_MENUPROXY=libappmenu.so
>     COMPIZ_CONFIG_PROFILE=ubuntu
>     ELMER_POST_HOME=/mpi3/C3/elmer_6283/bin
>     JPK_INS=/mpi3/C3
>     ELMER_LIB=/mpi3/C3/elmer_6283/share/elmersolver/lib
>     HYDRA_PROXY_RETRY_COUNT=3
>     GDMSESSION=ubuntu
>     JPK_ELMER_S=/mpi3/S3/elmer_6283
>     JPK_LMETISDIR5=/mpi3/C3/parmetis-4.0.2
>     JPK_LMETISDIR32=/mpi3/C3/ParMetis-3.2.0
>     HYDRA_DEBUG=1
>     JPK_BUI=/mpi3/S3
>     VTK_INCLUDEPATH=/mpi3/C3/VTK-5.8.0/include
>     SHLVL=1
>     HOME=/home/joni
>     OMPI_CC=//mpi3/C3/mpich-3.0.4/bin/mpicc
>     LANGUAGE=fi:en
>     OMPI_90=//mpi3/C3/mpich-3.0.4/bin/mpif90
>     ELMERGUI_HOME=/mpi3/C3/elmer_6283/bin
>     GNOME_DESKTOP_SESSION_ID=this-is-deprecated
>     MPI_IMPLEMENTATION=mpich
>     MKL_SERIAL=YES
>     LOGNAME=joni
>     HYPRE=/mpi3/C3/hypre-2.8.0b
>     JPK_ARPACK_S=/mpi3/S3/ARPACK
>     JPK_JOBS=7
>     JPK_VTK_DIR=/mpi3/C3/VTK-5.8.0
>     SCALAPACK=/mpi3/C3/scalapack-2.0.2
>
> XDG_DATA_DIRS=/usr/share/ubuntu:/usr/share/gnome:/usr/local/share/:/usr/share/
>
> DBUS_SESSION_BUS_ADDRESS=unix:abstract=/tmp/dbus-gAYmjGaklf,guid=26c3f15a7a5ee55e8782415700000034
>     JPK_ARPACK=/mpi3/C3/ARPACK
>     MPI_HOME=/mpi3/C3/mpich-3.0.4
>     LESSOPEN=| /usr/bin/lesspipe %s
>     LACPACK=/mpi3/C3/acml5.3.1/gfortran64_mp
>     OMPI_CXX=//mpi3/C3/mpich-3.0.4/bin/mpicxx
>     OMP_NUM_THREADS=6
>     JPK_TOGL_S=/mpi3/S3/Togl-1.7
>     HYDRA_LAUNCHER_EXEC=/usr/bin/rsh-redone-rsh
>     JPK_MPICH2=/mpi3/C3/mpich-3.0.4
>     PARDISO=/mpi3/C3/pardiso
>     PARDISOLICMESSAGE=1
>     JPK_VER=C3
>     XDG_CURRENT_DESKTOP=Unity
>     LESSCLOSE=/usr/bin/lesspipe %s %s
>     LC_TIME=fi_FI.UTF-8
>     JPK_HYPRE_S=/mpi3/S3/hypre-2.8.0b
>     JPK_MPICH2_S=/mpi3/S3/mpich-3.0.4
>     COLORTERM=gnome-terminal
>     XAUTHORITY=/home/joni/.Xauthority
>     LC_NAME=fi_FI.UTF-8
>     _=/mpi3/C3/mpich-3.0.4/bin/mpiexec
>     OLDPWD=/home/joni
>
>   Hydra internal environment:
>   ---------------------------
>     MPICH_ENABLE_CKPOINT=1
>     GFORTRAN_UNBUFFERED_PRECONNECTED=y
>
>
>     Proxy information:
>     *********************
>       [1] proxy: 192.168.0.41 (2 cores)
>       Exec list: hostname (2 processes);
>
>       [2] proxy: 192.168.0.42 (2 cores)
>       Exec list: hostname (2 processes);
>
>       [3] proxy: 192.168.0.43 (2 cores)
>       Exec list: hostname (2 processes);
>
>
>
> ==================================================================================================
>
> [mpiexec at mpi1] Timeout set to -1 (-1 means infinite)
> [mpiexec at mpi1] Got a control port string of 192.168.0.41:7001
>
> Proxy launch args: /mpi3/C3/mpich-3.0.4/bin/hydra_pmi_proxy --control-port
> 192.168.0.41:7001 --debug --rmk user --launcher rsh --launcher-exec
> /usr/bin/rsh-redone-rsh --demux poll --pgid 0 --retries 3 --usize -2
> --proxy-id
>
> Arguments being passed to proxy 0:
> --version 3.0.4 --iface-ip-env-name MPICH_INTERFACE_HOSTNAME --hostname
> 192.168.0.41 --global-core-map 0,2,6 --pmi-id-map 0,0
> --global-process-count 6 --auto-cleanup 1 --pmi-kvsname kvs_12243_0
> --pmi-process-mapping (vector,(0,3,2)) --ckpoint-prefix
> /mpi3/chekpoint/default.chk --ckpoint-num -1 --global-inherited-env 121
> [... the same 121 inherited environment variables listed above ...]
> --global-user-env 0 --global-system-env 2
> 'MPICH_ENABLE_CKPOINT=1' 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --genv-prop
> all --proxy-core-count 2 --exec --exec-appnum 0 --exec-proc-count 2
> --exec-local-env 0 --exec-wdir /mpi3/S3/mpich-3.0.4 --exec-args 1 hostname
>
> Arguments being passed to proxy 1:
> --version 3.0.4 --iface-ip-env-name MPICH_INTERFACE_HOSTNAME --hostname
> 192.168.0.42 --global-core-map 0,2,6 --pmi-id-map 0,2
> --global-process-count 6 --auto-cleanup 1 --pmi-kvsname kvs_12243_0
> --pmi-process-mapping (vector,(0,3,2)) --ckpoint-prefix
> /mpi3/chekpoint/default.chk --ckpoint-num -1 --global-inherited-env 121
> [... the same 121 inherited environment variables listed above ...]
> --global-user-env 0 --global-system-env 2
> 'MPICH_ENABLE_CKPOINT=1' 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --genv-prop
> all --proxy-core-count 2 --exec --exec-appnum 0 --exec-proc-count 2
> --exec-local-env 0 --exec-wdir /mpi3/S3/mpich-3.0.4 --exec-args 1 hostname
>
> Arguments being passed to proxy 2:
> --version 3.0.4 --iface-ip-env-name MPICH_INTERFACE_HOSTNAME --hostname
> 192.168.0.43 --global-core-map 0,2,6 --pmi-id-map 0,4
> --global-process-count 6 --auto-cleanup 1 --pmi-kvsname kvs_12243_0
> --pmi-process-mapping (vector,(0,3,2)) --ckpoint-prefix
> /mpi3/chekpoint/default.chk --ckpoint-num -1 --global-inherited-env 121
> [... the same 121 inherited environment variables listed above ...]
> --global-user-env 0 --global-system-env 2
> 'MPICH_ENABLE_CKPOINT=1' 'GFORTRAN_UNBUFFERED_PRECONNECTED=y' --genv-prop
> all --proxy-core-count 2 --exec --exec-appnum 0 --exec-proc-count 2
> --exec-local-env 0 --exec-wdir /mpi3/S3/mpich-3.0.4 --exec-args 1 hostname
>
> [mpiexec at mpi1] Launch arguments: /mpi3/C3/mpich-3.0.4/bin/hydra_pmi_proxy
> --control-port 192.168.0.41:7001 --debug --rmk user --launcher rsh
> --launcher-exec /usr/bin/rsh-redone-rsh --demux poll --pgid 0 --retries 3
> --usize -2 --proxy-id 0
> [mpiexec at mpi1] Launch arguments: /usr/bin/rsh-redone-rsh 192.168.0.42
> "/mpi3/C3/mpich-3.0.4/bin/hydra_pmi_proxy" --control-port
> 192.168.0.41:7001 --debug --rmk user --launcher rsh --launcher-exec
> /usr/bin/rsh-redone-rsh --demux poll --pgid 0 --retries 3 --usize -2
> --proxy-id 1
> [mpiexec at mpi1] Launch arguments: /usr/bin/rsh-redone-rsh 192.168.0.43
> "/mpi3/C3/mpich-3.0.4/bin/hydra_pmi_proxy" --control-port
> 192.168.0.41:7001 --debug --rmk user --launcher rsh --launcher-exec
> /usr/bin/rsh-redone-rsh --demux poll --pgid 0 --retries 3 --usize -2
> --proxy-id 2
> mpi1
> m
>
>
> 2013/8/27 Pavan Balaji <balaji at mcs.anl.gov>
>
>>
>> Please don't drop discuss at mpich.org from the cc list.
>>
>> I doubt demux, --assert-level and blcr are relevant here.  Also, the
>> output of "make testing" is not helpful for us, because those tests can
>> fail simply because your machines are too slow.
>>
>> Did you try my suggestion from the previous email?  Could you try them
>> and report back (just with that information)?
>>
>>  -- Pavan
>>
>>
>> On 08/27/2013 09:35 AM, Joni-Pekka Kurronen wrote:
>>
>>>
>>> hi,
>>>
>>> I have already checked that.
>>> This is not a new install; I just jumped up from MPICH2 to 3 and from
>>> gcc 4.6 to gcc 4.7.
>>>
>>> I have rsh-redone-rsh as the main launcher but have tested ssh as well.
>>> After a crash the clients keep running on the slaves and using the hard
>>> disk.
>>>
>>> The following work:
>>> any machine alone,
>>> mpi1 and kaak, or mpi1 and ugh,
>>> but not all together, except for hostname.
>>>
>>> This could be related to:
>>> - demux (have tried select and poll; with poll I have to restart the
>>> slave machines)
>>> - nfs4 (for some reason nfs4 must be mounted manually on the slaves
>>> after a restart)
>>> - have changed --assert-level to 0 (default 2)
>>> - blcr
>>>
>>>
>>> ch3:sock settings:
>>>
>>> =============
>>> hosts file:
>>> 192.168.0.41:2
>>> 192.168.0.42:2
>>> #192.168.0.43:2
>>>
>>> =============
>>> summary.xml errors
>>> <MPITEST>
>>> <NAME>spawninfo1</NAME>
>>> <NP>1</NP>
>>> <WORKDIR>./spawn</WORKDIR>
>>> <STATUS>fail</STATUS>
>>> <TESTDIFF>
>>> [mpiexec at mpi1] APPLICATION TIMED OUT
>>> [proxy:0:0 at mpi1] HYD_pmcd_pmip_control_cmd_cb
>>> (./pm/pmiserv/pmip_cb.c:886): assert (!closed) failed
>>> [proxy:0:0 at mpi1] HYDT_dmxu_poll_wait_for_event
>>> (./tools/demux/demux_poll.c:77): callback returned error status
>>> [proxy:0:0 at mpi1] main (./pm/pmiserv/pmip.c:206): demux engine error
>>> waiting for event
>>> [proxy:1:0 at mpi1] HYD_pmcd_pmip_control_cmd_cb
>>> (./pm/pmiserv/pmip_cb.c:886): assert (!closed) failed
>>> [proxy:1:0 at mpi1] HYDT_dmxu_poll_wait_for_event
>>> (./tools/demux/demux_poll.c:77): callback returned error status
>>> [proxy:1:0 at mpi1] main (./pm/pmiserv/pmip.c:206): demux engine error
>>> waiting for event
>>> [mpiexec at mpi1] HYDT_bscu_wait_for_completion
>>> (./tools/bootstrap/utils/bscu_wait.c:76): one of the processes
>>> terminated badly; aborting
>>> [mpiexec at mpi1] HYDT_bsci_wait_for_completion
>>> (./tools/bootstrap/src/bsci_wait.c:23): launcher returned error
>>> waiting for completion
>>> [mpiexec at mpi1] HYD_pmci_wait_for_completion
>>> (./pm/pmiserv/pmiserv_pmci.c:188): launcher returned error waiting for
>>> completion
>>> [mpiexec at mpi1] main (./ui/mpich/mpiexec.c:331): process manager error
>>> waiting for completion
>>> </TESTDIFF>
>>> </MPITEST>
>>> <MPITEST>
>>> <NAME>rdwrord</NAME>
>>> <NP>4</NP>
>>> <WORKDIR>./io</WORKDIR>
>>> <STATUS>fail</STATUS>
>>> <TESTDIFF>
>>> Fatal error in PMPI_Bcast: Other MPI error
>>> [cli_2]: aborting job:
>>> Fatal error in PMPI_Bcast: Other MPI error
>>> Fatal error in PMPI_Bcast: Other MPI error
>>> [cli_0]: aborting job:
>>> Fatal error in PMPI_Bcast: Other MPI error
>>>
>>> ===================================================================================
>>> =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
>>> =   EXIT CODE: 1
>>> =   CLEANING UP REMAINING PROCESSES
>>> =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
>>> ===================================================================================
>>> </TESTDIFF>
>>> </MPITEST>
>>> <MPITEST>
>>> <NAME>rdwrzero</NAME>
>>> <NP>4</NP>
>>> <WORKDIR>./io</WORKDIR>
>>> <STATUS>fail</STATUS>
>>> <TESTDIFF>
>>> Fatal error in PMPI_Bcast: Other MPI error
>>> [cli_2]: aborting job:
>>> Fatal error in PMPI_Bcast: Other MPI error
>>> Fatal error in PMPI_Bcast: Other MPI error
>>> [cli_0]: aborting job:
>>> Fatal error in PMPI_Bcast: Other MPI error
>>>
>>> ===================================================================================
>>> =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
>>> =   EXIT CODE: 1
>>> =   CLEANING UP REMAINING PROCESSES
>>> =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
>>> ===================================================================================
>>> </TESTDIFF>
>>> </MPITEST>
>>> <MPITEST>
>>> <NAME>getextent</NAME>
>>> <NP>2</NP>
>>> <WORKDIR>./io</WORKDIR>
>>> <STATUS>pass</STATUS>
>>> </MPITEST>
>>> <MPITEST>
>>> <NAME>setinfo</NAME>
>>> <NP>4</NP>
>>> <WORKDIR>./io</WORKDIR>
>>> <STATUS>fail</STATUS>
>>> <TESTDIFF>
>>> Fatal error in PMPI_Bcast: Other MPI error
>>> [cli_2]: aborting job:
>>> Fatal error in PMPI_Bcast: Other MPI error
>>> Fatal error in PMPI_Bcast: Other MPI error
>>> [cli_0]: aborting job:
>>> Fatal error in PMPI_Bcast: Other MPI error
>>>
>>> ===================================================================================
>>> =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
>>> =   EXIT CODE: 1
>>> =   CLEANING UP REMAINING PROCESSES
>>> =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
>>> ===================================================================================
>>> </TESTDIFF>
>>> </MPITEST>
>>> <MPITEST>
>>> <NAME>setviewcur</NAME>
>>> <NP>4</NP>
>>> <WORKDIR>./io</WORKDIR>
>>> <STATUS>fail</STATUS>
>>> <TESTDIFF>
>>> Fatal error in PMPI_Bcast: Other MPI error
>>> [cli_2]: aborting job:
>>> Fatal error in PMPI_Bcast: Other MPI error
>>> Fatal error in PMPI_Bcast: Other MPI error
>>> [cli_0]: aborting job:
>>> Fatal error in PMPI_Bcast: Other MPI error
>>>
>>> ===================================================================================
>>> =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
>>> =   EXIT CODE: 1
>>> =   CLEANING UP REMAINING PROCESSES
>>> =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
>>> ===================================================================================
>>> </TESTDIFF>
>>> </MPITEST>
>>>
>>> ....
>>> ....
>>> ....
>>>
>>>
>>> ============
>>> ============
>>> hosts file,..
>>> 192.168.0.41:2 <http://192.168.0.41:2>
>>> 192.168.0.42:2 <http://192.168.0.42:2>
>>> 192.168.0.43:2 <http://192.168.0.43:2>
>>>
>>> ============
>>> ============
>>>
>>> When more than 4 processes are needed, the job hangs:
>>>
>>> Unexpected output in allred3: [mpiexec at mpi1] APPLICATION TIMED OUT
>>> Unexpected output in allred3: [proxy:0:0 at mpi1]
>>> HYD_pmcd_pmip_control_cmd_cb (./pm/pmiserv/pmip_cb.c:886): assert
>>> (!closed) failed
>>> Unexpected output in allred3: [proxy:0:0 at mpi1]
>>> HYDT_dmxu_poll_wait_for_event (./tools/demux/demux_poll.c:77):
>>> callback
>>> returned error status
>>> Unexpected output in allred3: [proxy:0:0 at mpi1] main
>>> (./pm/pmiserv/pmip.c:206): demux engine error waiting for event
>>>
>>> ======
>>>
>>> if using ch3:nemesis, the hard disk is not running all the time as it
>>> is with ch3:sock...
>>>
>>>
>>>
>>> ================================================
>>> This is read by rsh-redone-rsh for every process:
>>> ================================================
>>> #!/bin/bash
>>>
>>> # JPK-Integration for Ubuntu 12.04 LTS
>>> #
>>> # https://sites.google.com/site/jpsdatareviewstheboy007/ubuntu-lts-12-4-companion-whit-ltsp-mpich2-elmer-openfoam
>>> #
>>> # CMake goes into a loop and cannot build; the documentation says the
>>> # cmake build is under development
>>>
>>> # gcc 4.7
>>> # bdver1 optimization
>>>
>>> shopt -s expand_aliases
>>> export JPK_MPI_DIR=/mpi3         # MAIN DIRECTORY, SUBDIRECTORIES:
>>> export JPK_VER=C3                # BINARY CODE
>>> export JPK_VER_S=S3              # SOURCE CODE
>>> export JPK_VER_B=B3              # BASH FILES TO COMPILE AND CONFIGURE
>>> export JPK_INS=$JPK_MPI_DIR/$JPK_VER
>>> export JPK_BUI=$JPK_MPI_DIR/$JPK_VER_S
>>> export JPK_ELMER=$JPK_INS/elmer_6283 #035
>>> export JPK_ELMER_S=$JPK_BUI/elmer_6283
>>> export JPK_NETGEN_S=$JPK_BUI/netgen_668
>>> export JPK_NETGEN=$JPK_INS/netgen_668
>>>
>>> #GCC
>>> #export JPK_FLAGS="-Wl,--no-as-needed -fPIC -DAdd_ -m64 -pthread -O3
>>> -fopenmp -lgomp -march=bdver1 -ftree-vectorize -funroll-loops"
>>> #export CFLAGS="-Wl,--no-as-needed -fPIC -DAdd_ -m64 -pthread -fopenmp
>>> -lgomp"
>>>
>>> # M A K E
>>>
>>> export JPK_JOBS=7
>>>
>>> # O P E N  MP
>>> export OMP_NUM_THREADS=6
>>>
>>>
>>> # M P I C 3
>>> # http://wiki.mcs.anl.gov/mpich2/index.php/Using_the_Hydra_Process_Manager
>>> export JPK_MPICH2_S=$JPK_BUI/mpich-3.0.4
>>> export JPK_MPICH2=$JPK_INS/mpich-3.0.4
>>> export PATH=$JPK_MPICH2/bin:$PATH
>>> export MPI_HOME=$JPK_MPICH2
>>> export MPI_LIBS="-L$JPK_MPICH2/lib -lmpich -lmpichf90 -lmpl -lopa
>>> -lmpichcxx"
>>> export LD_LIBRARY_PATH=$JPK_MPICH2/lib:$JPK_MPICH2/bin # FIRST
>>>
>>> # M P I
>>>
>>> export MPI_IMPLEMENTATION=mpich
>>>
>>> export OMPI_CC=/$JPK_MPI_DIR/$JPK_VER/mpich-3.0.4/bin/mpicc
>>> export OMPI_CXX=/$JPK_MPI_DIR/$JPK_VER/mpich-3.0.4/bin/mpicxx
>>> export OMPI_77=/$JPK_MPI_DIR/$JPK_VER/mpich-3.0.4/bin/mpif77
>>> export OMPI_90=/$JPK_MPI_DIR/$JPK_VER/mpich-3.0.4/bin/mpif90
>>>
>>> # http://wiki.mcs.anl.gov/mpich2/index.php/Using_the_Hydra_Process_Manager
>>> export HYDRA_DEBUG=0
>>> export HYDRA_HOST_FILE=/mpi4/hosts
>>> export HYDRA_LAUNCHER=rsh
>>> #export HYDRA_LAUNCHER=ssh
>>> #export HYDRA_LAUNCHER_EXEC=/usr/bin/netkit-rsh
>>> export HYDRA_LAUNCHER_EXEC=/usr/bin/rsh-redone-rsh
>>> #export HYDRA_LAUNCHER_EXEC=/usr/bin/ssh
>>> export HYDRA_DEMUX=select
>>> #export HYDRA_DEMUX=select # more processes than cores
>>> export HYDRA_PROXY_RETRY_COUNT=3
>>> #export HYDRA_RMK=pbs
>>> #export HYDRA_DEFAULT_RMK=pbs
>>> export HYDRA_ENV=all
>>> export MPIEXEC_PORT_RANGE=7000:7500
>>> #mpirun -launcher rsh -launcher-exec /usr/bin/netkit-rsh -demux select
>>> -n 21 ddd ./cpi
>>>
>>> # b l c r
>>>
>>> export HYDRA_CKPOINTLIB=blcr
>>> export HYDRA_CKPOINT_PREFIX=/mpi3/chekpoint/default.chk
>>> export HYDRA_CKPOINT_INTERVAL=10800
>>> export PATH=$JPK_INS/blcr-0.8.5/bin:$PATH
>>> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$JPK_INS/blcr-0.8.5/lib
>>> #-ckpoint-num 5
>>>
>>> # C M A K E - BUILD
>>> export PATH=$JPK_INS/cmake-2.8.10.2/bin:$PATH
>>> export CMAKE_COMMAND=$JPK_INS/cmake-2.8.10.2/bin
>>>
>>> # T O G L - UI netgen
>>>
>>> export JPK_TOGL="$JPK_INS/Togl-1.7"
>>> export JPK_TOGL_S="$JPK_BUI/Togl-1.7"
>>>
>>> # OCC
>>> export JPK_OCC=/usr/include/oce
>>>
>>> # M A T H
>>>
>>> export JPK_ARPACK_S=$JPK_BUI/ARPACK
>>> export JPK_ARPACK=$JPK_INS/ARPACK
>>>
>>> export BLAS=$JPK_INS/acml5.3.1/gfortran64_mp
>>> export BLAS32=$JPK_INS/acml5.3.1/gfortran64_mp
>>> #export BLAS=$JPK_INS/clAmdBlas-1.10.321/lib64
>>> #export BLAS32=$JPK_INS/clAmdBlas-1.10.321/include
>>> export FFT=$JPK_INS/fftw2
>>> export LACPACK=$BLAS
>>> export SCALAPACK=$JPK_INS/scalapack-2.0.2
>>> export BLACS=$SCALAPACK
>>>
>>> export JPK_LMETISDIR_S=$JPK_BUI/ParMetis-3.2.0
>>> export JPK_LMETISDIR=$JPK_INS/ParMetis-3.2.0
>>> export JPK_LMETISDIR32=$JPK_LMETISDIR
>>> export JPK_LMETISDIR_S5=$JPK_BUI/parmetis-4.0.2
>>> export JPK_LMETISDIR5=$JPK_INS/parmetis-4.0.2
>>>
>>> export METIS_DIR="" #$JPK_LMETISDIR MUST BE EMPTY
>>> export METIS_INCLUDE_DIR=$JPK_LMETISDIR
>>> export METIS_LIBDIR=$JPK_LMETISDIR
>>> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$JPK_LMETISDIR:$BLAS/lib:$FFT/lib
>>> #/mpi4/S/metis-5.0.2/GKlib
>>>
>>> export SCOTCHDIR=$JPK_INS/scotch_6.0.0
>>> export JPK_SCOTCHDIR_S=$JPK_BUI/scotch_6.0.0_esmumps
>>>
>>> export MUMPS_I=$JPK_INS/MUMPS_4.10.0
>>> export MUMPS=$JPK_BUI/MUMPS_4.10.0
>>>
>>> export HYPRE=$JPK_INS/hypre-2.8.0b
>>> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HYPRE/lib
>>> export JPK_HYPRE_S=$JPK_BUI/hypre-2.8.0b
>>>
>>> export PARDISOLICMESSAGE=1
>>> export PARDISO=$JPK_INS/pardiso
>>> export PARDISO_LIC_PATH=$PARDISO
>>> export MKL_SERIAL=YES
>>>
>>> export
>>> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$SCOTCHDIR/lib:$MUMPS/lib:$BLAS/lib:$SCALAPACK/lib:$HYPRE/lib:$PARDISO:$METIS_LIBDIR:$JPK_ARPACK
>>>
>>> #HDF5
>>> #export JPK_HDF5_S=$JPK_BUI/hdf5-1.8.10-patch1 for vtk testing
>>> #export JPK_HDF5=$JPK_INS/hdf5-1.8.10-patch1
>>> export JPK_HDF5_S=$JPK_BUI/hdf5-1.8.10-patch1
>>> export JPK_HDF5=$JPK_INS/hdf5-1.8.10-patch1
>>> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$JPK_HDF5/lib
>>>
>>> # V T K
>>> export JPK_VTK_DIR=$JPK_INS/VTK-5.8.0
>>> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$JPK_VTK_DIR/lib/vtk-5.8
>>> export VTK_INCLUDEPATH=$JPK_VTK_DIR/include
>>>
>>> # Q T
>>> export QT_QMAKE_EXECUTABLE=/usr/bin/qmake-qt4
>>>
>>> # O P E N    F O A M
>>> # http://www.openfoam.org/download/source.php
>>>
>>> #export WM_SCHEDULER=wmakeScheduler
>>> #export WM_HOSTS="192.168.0.41:6 192.168.0.42:6 192.168.0.43:6"
>>>
>>> #export WM_NCOMPPROCS=$($WM_SCHEDULER -count)
>>> #export WM_COLOURS="black blue green cyan red magenta yellow"
>>>
>>> #export FOAM_INST_DIR=/mpi2/OpenFOAM
>>> #foamDotFile=$FOAM_INST_DIR/OpenFOAM-2.1.x/etc/bashrc
>>> #[ -f $foamDotFile ] && . $foamDotFile
>>> #source /mpi3/OpenFOAM/OpenFOAM-2.1.x/etc/bashrc
>>>
>>> #export FOAM_RUN=/mpi2/om
>>> #export OpenBIN=/mpi2/OpenFOAM/OpenFOAM-2.1.x/bin/tools
>>> #export PATH=$OpenBIN:$PATH
>>> #export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/apps/OpenFOAM/tools/lib64
>>>
>>> #export
>>> ParaView_DIR=/mpi2/OpenFOAM/ThirdParty-2.1.x/platforms/linux64Gcc/paraview-3.12.0
>>> #export PATH=$ParaView_DIR/bin:$PATH
>>> #export PV_PLUGIN_PATH=$FOAM_LIBBIN/paraview-3.12
>>>
>>> # E L M E R
>>> export ELMER_HOME=$JPK_ELMER
>>> export ELMER_LIB=$JPK_ELMER/share/**elmersolver/lib
>>> export PATH=$PATH:$ELMER_HOME/bin:$**ELMER_HOME/lib
>>> export LD_LIBRARY_PATH=$LD_LIBRARY_**PATH:$ELMER_HOME/lib
>>> export ELMERGUI_HOME=$ELMER_HOME/bin
>>> export ELMER_POST_HOME=$ELMER_HOME/**bin
>>>
>>> # S a l o m é
>>> #cd /mpi2/salome-meca/SALOME-MECA-2012.2-LGPL ; source
>>> envSalomeMeca.sh
>>> #cd ~/
>>>
>>> # Paraview
>>> #export PATH=$PATH:$JPK_INS/ParaView-3.14.1-Linux-64bit
>>> export PATH=$PATH:$JPK_INS/ParaView3
>>>
>>> # N E T G E N   P A R A L L E L $JPK_TCL/lib:$JPK_TK/lib:
>>> #export
>>> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$JPK_TOGL:$JPK_NETGEN\par/lib:/usr/lib/
>>> #export NETGENDIR=$JPK_NETGEN\par/bin
>>> # NETGEN
>>> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$JPK_TOGL:$JPK_NETGEN/lib:/usr/lib/
>>> export NETGENDIR=$JPK_NETGEN/bin
>>>
>>> #crontab, ext editor
>>> export EDITOR=nano
>>>
>>> #space ball
>>> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib
>>>
>>> #vrpn & hidapi
>>> export
>>> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/$JPK_MPI_DIR/$JPK_VER/vrpn/lib:/$JPK_MPI_DIR/$JPK_VER/hidapi/lib:/usr/include/libusb-1.0
>>>
>>>
>>>
>>>
>>>
>>> 2013/8/27 Pavan Balaji <balaji at mcs.anl.gov>
>>>
>>>
>>>
>>>     This is almost certainly a network issue with your third machine
>>>     (kaak, I presume?).
>>>
>>>     Thanks for making sure "hostname" works fine on all machines.  That
>>>     means that your ssh connections are setup correctly.  But a non-MPI
>>>     program, such as hostname, does not check the connection from kaak
>>>     back to mpi1.
>>>
>>>     Can you try a simple program like "examples/cpi" in the build
>>>     directory on all machines?  Try it on 2 machines (mpiexec -np 4) and
>>>     3 machines (mpiexec -np 6).
>>>
>>>     If the third machine is in fact having problems running the
>>> application:
>>>
>>>     1. Make sure there's no firewall on the third machine.
>>>
>>>     2. Make sure the /etc/hosts file is consistent on both the machines
>>>     (mpi1 and kaak).
>>>
>>>       -- Pavan
>>>
>>>
>>>     On 08/27/2013 06:46 AM, Joni-Pekka Kurronen wrote:
>>>
>>>
>>>         I have:
>>>         -Ubuntu 12.04
>>>         -rsh-redone-rsh
>>>         -three machines
>>>         -mpich3
>>>         -have tried export HYDRA_DEMUX=select / poll
>>>         -have tried ssh/rsh
>>>         -have added to LIBS: event_core event_pthreads
>>>
>>>         I can run the tests on one or two machines without error, but
>>>         when I take the third machine into the cluster the demux engine
>>>         goes mad: a connection hangs and nothing happens.
>>>
>>>
>>>         <MPITEST>
>>>         <NAME>uoplong</NAME>
>>>         <NP>11</NP>
>>>         <WORKDIR>./coll</WORKDIR>
>>>         <STATUS>fail</STATUS>
>>>         <TESTDIFF>
>>>         [mpiexec at mpi1] APPLICATION TIMED OUT
>>>         [proxy:0:0 at mpi1] HYD_pmcd_pmip_control_cmd_cb
>>>         (./pm/pmiserv/pmip_cb.c:886): assert (!closed) failed
>>>         [proxy:0:0 at mpi1] HYDT_dmxu_poll_wait_for_event
>>>         (./tools/demux/demux_poll.c:77): callback returned error status
>>>         [proxy:0:0 at mpi1] main (./pm/pmiserv/pmip.c:206): demux engine
>>>         error waiting for event
>>>         [mpiexec at mpi1] HYDT_bscu_wait_for_completion
>>>         (./tools/bootstrap/utils/bscu_wait.c:76): one of the processes
>>>         terminated badly; aborting
>>>         [mpiexec at mpi1] HYDT_bsci_wait_for_completion
>>>         (./tools/bootstrap/src/bsci_wait.c:23): launcher returned
>>>         error waiting for completion
>>>         [mpiexec at mpi1] HYD_pmci_wait_for_completion
>>>         (./pm/pmiserv/pmiserv_pmci.c:188): launcher returned error
>>>         waiting for completion
>>>         [mpiexec at mpi1] main (./ui/mpich/mpiexec.c:331): process manager
>>>         error waiting for completion
>>>         </TESTDIFF>
>>>         </MPITEST>
>>>
>>>         Also I can run
>>>         joni at mpi1:/mpi3/S3/hpcc-1.4.2$ mpiexec -np 6 hostname
>>>         mpi1
>>>         mpi1
>>>         ugh
>>>         ugh
>>>         kaak
>>>         kaak
>>>
>>>         but if I run
>>>         joni at mpi1:/mpi3/S3/hpcc-1.4.2$ mpiexec -np 6 ls
>>>         I get only one directory listing as output, and the
>>>         system hangs until I have restarted the slave machines!
>>>
>>>
>>>
>>>
>>>
>>>     --
>>>     Pavan Balaji
>>>     http://www.mcs.anl.gov/~balaji
>>>
>>>
>>>
>>>
>>> --
>>> Joni-Pekka Kurronen
>>> Joni.Kurronen at gmail.com
>>>
>>> gsm. +358 50 521 2279
>>>
>>
>> --
>> Pavan Balaji
>> http://www.mcs.anl.gov/~balaji
>>
>
>
>
> --
> Joni-Pekka Kurronen
>
> Joni.Kurronen at gmail.com
> gsm. +358 50 521 2279
>



-- 
Joni-Pekka Kurronen
Joni.Kurronen at gmail.com
gsm. +358 50 521 2279