<html><head>
<meta http-equiv="Content-Type" content="text/html; charset=Windows-1252">
</head>
<body bgcolor="#FFFFFF" text="#000000">
Sure. Please send to <a class="moz-txt-link-abbreviated" href="mailto:msi@il.is.s.u-tokyo.ac.jp">msi@il.is.s.u-tokyo.ac.jp</a>.<br>
<br>
Min<br>
<br>
<div class="moz-cite-prefix">On 10/12/15 9:24 AM, Siegmar Gross
wrote:<br>
</div>
<blockquote cite="mid:561BC2B0.50603@informatik.hs-fulda.de" type="cite">Hi Min,
<br>
<br>
<blockquote type="cite">It seems you already enabled the most
detailed error outputs. We could
<br>
not think out any clue for now. If you can give us access to
your
<br>
machine, we are glad to help you debug on it.
<br>
</blockquote>
<br>
Could you send me your email address? I don't want to send login
data to this list.
<br>
<br>
<br>
Kind regards
<br>
<br>
Siegmar
<br>
<br>
<br>
<blockquote type="cite">
<br>
Min
<br>
<br>
On 10/8/15 12:02 AM, Siegmar Gross wrote:
<br>
<blockquote type="cite">Hi Min,
<br>
<br>
thank you very much for your answer.
<br>
<br>
<blockquote type="cite">We cannot reproduce this error on our
test machines (Solaris i386,
<br>
Ubuntu x86_64) by using your programs. And unfortunately we
do not have
<br>
Solaris Sparc machine thus could not verify it.
<br>
</blockquote>
<br>
The programs work fine on my Solaris x86_64 and Linux machines as well.
I only have a problem on Solaris Sparc.
<br>
<br>
<br>
<blockquote type="cite">Sometime, it can happen that you need
to add "./" in front of the
<br>
program path, could you try it ?
<br>
<br>
For example, in spawn_master.c MPI: A Message-Passing
Interface Standard
<br>
<blockquote type="cite">#define SLAVE_PROG
"./spawn_slave"
<br>
</blockquote>
</blockquote>
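<br>
For illustration, here is a minimal parent-side sketch of the kind of call
such a #define would feed into. The real spawn_master.c is attached to the
original mail and is not reproduced here, so the details below are only an
assumption based on the program name and on the parameters visible in the
error stack further down (argv=0, maxprocs=4, MPI_INFO_NULL, errors=0):
<br>
<pre>
/* Hypothetical sketch, not the attached spawn_master.c. */
#include &lt;stdio.h&gt;
#include &lt;mpi.h&gt;

#define SLAVE_PROG  "./spawn_slave"   /* "./" forces lookup in the cwd   */
#define NUM_SLAVES  4                 /* matches maxprocs=4 in the stack */

int main(int argc, char *argv[])
{
  MPI_Comm intercomm;
  int rank;

  MPI_Init(&amp;argc, &amp;argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &amp;rank);
  if (rank == 0) {
    printf("I create %d slave processes\n", NUM_SLAVES);
  }
  MPI_Comm_spawn(SLAVE_PROG, MPI_ARGV_NULL, NUM_SLAVES, MPI_INFO_NULL,
                 0, MPI_COMM_WORLD, &amp;intercomm, MPI_ERRCODES_IGNORE);
  MPI_Comm_free(&amp;intercomm);
  MPI_Finalize();
  return 0;
}
</pre>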
<br>
No, it will not work, because the programs are stored in a different
directory ($HOME/{SunOS, Linux}/{sparc, x86_64}/bin), which is part
of PATH (as well as ".").
<br>
<br>
Can I do anything to track the source of the error?
<br>
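<br>
One generic way to narrow the problem down, independent of the error output
that mpiexec already prints, is to switch MPI_COMM_WORLD to MPI_ERRORS_RETURN
and decode whatever error code MPI_Comm_spawn hands back. A minimal sketch
(the spawn parameters here are placeholders, not the attached programs):
<br>
<pre>
/* Sketch: return errors to the caller instead of aborting, then decode
 * the error class and message with standard MPI calls. */
#include &lt;stdio.h&gt;
#include &lt;mpi.h&gt;

int main(int argc, char *argv[])
{
  MPI_Comm intercomm;
  char msg[MPI_MAX_ERROR_STRING];
  int rc, errclass, len;

  MPI_Init(&amp;argc, &amp;argv);
  MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);

  rc = MPI_Comm_spawn("./spawn_slave", MPI_ARGV_NULL, 4, MPI_INFO_NULL,
                      0, MPI_COMM_WORLD, &amp;intercomm, MPI_ERRCODES_IGNORE);
  if (rc != MPI_SUCCESS) {
    MPI_Error_class(rc, &amp;errclass);
    MPI_Error_string(rc, msg, &amp;len);
    fprintf(stderr, "MPI_Comm_spawn failed: class %d: %s\n", errclass, msg);
  } else {
    MPI_Comm_free(&amp;intercomm);
  }
  MPI_Finalize();
  return 0;
}
</pre>
Whether this reveals more than the existing "Unknown error class" stack is
not certain, but it at least shows which error class the Sparc build reports.
<br>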
<br>
<br>
Kind regards
<br>
<br>
Siegmar
<br>
<br>
<blockquote type="cite">
<br>
Min
<br>
<br>
On 10/7/15 5:03 AM, Siegmar Gross wrote:
<br>
<blockquote type="cite">Hi,
<br>
<br>
today I have built mpich-3.2rc1 on my machines (Solaris 10 Sparc,
Solaris 10 x86_64, and openSUSE Linux 12.1 x86_64) with gcc-5.1.0
and Sun C 5.13. I still get the following errors on my Sparc machine,
which I had already reported on September 8th. "mpiexec" is aliased
to 'mpiexec -genvnone'. It still does not matter whether I use the
cc or gcc build of MPICH.
<br>
<br>
<br>
tyr spawn 119 mpichversion
<br>
MPICH Version: 3.2rc1
<br>
MPICH Release date: Wed Oct 7 00:00:33 CDT 2015
<br>
MPICH Device: ch3:nemesis
<br>
MPICH configure: --prefix=/usr/local/mpich-3.2_64_cc
--libdir=/usr/local/mpich-3.2_64_cc/lib64
--includedir=/usr/local/mpich-3.2_64_cc/include64 CC=cc CXX=CC F77=f77
FC=f95 CFLAGS=-m64 CXXFLAGS=-m64 FFLAGS=-m64 FCFLAGS=-m64 LDFLAGS=-m64
-L/usr/lib/sparcv9 -R/usr/lib/sparcv9 --enable-fortran=yes --enable-cxx
--enable-romio --enable-debuginfo --enable-smpcoll
--enable-threads=multiple --with-thread-package=posix --enable-shared
<br>
MPICH CC: cc -m64 -O2
<br>
MPICH CXX: CC -m64 -O2
<br>
MPICH F77: f77 -m64
<br>
MPICH FC: f95 -m64 -O2
<br>
tyr spawn 120
<br>
<br>
<br>
<br>
tyr spawn 111 mpiexec -np 1 spawn_master
<br>
<br>
Parent process 0 running on tyr.informatik.hs-fulda.de
<br>
I create 4 slave processes
<br>
<br>
Fatal error in MPI_Comm_spawn: Unknown error class, error stack:
<br>
MPI_Comm_spawn(144)...........: MPI_Comm_spawn(cmd="spawn_slave",
argv=0, maxprocs=4, MPI_INFO_NULL, root=0, MPI_COMM_WORLD,
intercomm=ffffffff7fffde50, errors=0) failed
<br>
MPIDI_Comm_spawn_multiple(274):
<br>
MPID_Comm_accept(153).........:
<br>
MPIDI_Comm_accept(1057).......:
<br>
MPIR_Bcast_intra(1287)........:
<br>
MPIR_Bcast_binomial(310)......: Failure during collective
<br>
<br>
<br>
<br>
<br>
tyr spawn 112 mpiexec -np 1 spawn_multiple_master
<br>
<br>
Parent process 0 running on tyr.informatik.hs-fulda.de
<br>
I create 3 slave processes.
<br>
<br>
Fatal error in MPI_Comm_spawn_multiple: Unknown error class, error stack:
<br>
MPI_Comm_spawn_multiple(162)..: MPI_Comm_spawn_multiple(count=2,
cmds=ffffffff7fffde08, argvs=ffffffff7fffddf8,
maxprocs=ffffffff7fffddf0, infos=ffffffff7fffdde8, root=0,
MPI_COMM_WORLD, intercomm=ffffffff7fffdde4, errors=0) failed
<br>
MPIDI_Comm_spawn_multiple(274):
<br>
MPID_Comm_accept(153).........:
<br>
MPIDI_Comm_accept(1057).......:
<br>
MPIR_Bcast_intra(1287)........:
<br>
MPIR_Bcast_binomial(310)......: Failure during collective
<br>
<br>
<br>
<br>
<br>
tyr spawn 113 mpiexec -np 1 spawn_intra_comm
<br>
Parent process 0: I create 2 slave processes
<br>
Fatal error in MPI_Comm_spawn: Unknown error class, error stack:
<br>
MPI_Comm_spawn(144)...........: MPI_Comm_spawn(cmd="spawn_intra_comm",
argv=0, maxprocs=2, MPI_INFO_NULL, root=0, MPI_COMM_WORLD,
intercomm=ffffffff7fffded4, errors=0) failed
<br>
MPIDI_Comm_spawn_multiple(274):
<br>
MPID_Comm_accept(153).........:
<br>
MPIDI_Comm_accept(1057).......:
<br>
MPIR_Bcast_intra(1287)........:
<br>
MPIR_Bcast_binomial(310)......: Failure during collective
<br>
tyr spawn 114
<br>
<br>
<br>
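The attached programs themselves are not reproduced here. Judging only by
the program names and by the parameters in the error stacks above
(cmd="spawn_intra_comm", maxprocs=2), a self-spawning test along the
following lines exercises the same code path. This is an assumed stand-in,
not the actual spawn_intra_comm.c:
<br>
<pre>
/* Hypothetical stand-in for spawn_intra_comm.c: the program spawns two
 * copies of itself and merges parent and children into one
 * intra-communicator. Only standard MPI calls are used. */
#include &lt;stdio.h&gt;
#include &lt;mpi.h&gt;

int main(int argc, char *argv[])
{
  MPI_Comm parent, intercomm, intracomm;
  int rank, size;

  MPI_Init(&amp;argc, &amp;argv);
  MPI_Comm_get_parent(&amp;parent);

  if (parent == MPI_COMM_NULL) {
    /* Parent: spawn two children running the same binary. */
    MPI_Comm_spawn("spawn_intra_comm", MPI_ARGV_NULL, 2, MPI_INFO_NULL,
                   0, MPI_COMM_WORLD, &amp;intercomm, MPI_ERRCODES_IGNORE);
    MPI_Intercomm_merge(intercomm, 0, &amp;intracomm);
    MPI_Comm_free(&amp;intercomm);
  } else {
    /* Child: merge with the parent side of the inter-communicator. */
    MPI_Intercomm_merge(parent, 1, &amp;intracomm);
  }

  MPI_Comm_rank(intracomm, &amp;rank);
  MPI_Comm_size(intracomm, &amp;size);
  printf("process %d of %d in the merged intra-communicator\n", rank, size);

  MPI_Comm_free(&amp;intracomm);
  MPI_Finalize();
  return 0;
}
</pre>
Run as above (mpiexec -np 1 spawn_intra_comm); if the spawn succeeds, all
three merged processes should print a line.
<br>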
I would be grateful if somebody could fix the problem. Thank you very
much in advance for any help. I've attached my programs. Please let
me know if you need anything else.
<br>
<br>
<br>
Kind regards
<br>
<br>
Siegmar
<br>
<br>
<br>
</blockquote>
<br>
<br>
<br>
<br>
</blockquote>
<br>
<br>
<br>
</blockquote>
<br>
<br>
<br>
<br>
</blockquote>
<br>
<br>
<fieldset class="mimeAttachmentHeader"></fieldset>
<br>
<pre wrap="">_______________________________________________
discuss mailing list <a class="moz-txt-link-abbreviated" href="mailto:discuss@mpich.org">discuss@mpich.org</a>
To manage subscription options or unsubscribe:
<a class="moz-txt-link-freetext" href="https://lists.mpich.org/mailman/listinfo/discuss">https://lists.mpich.org/mailman/listinfo/discuss</a></pre>
</blockquote>
<br>
</body>
</html>