<p class="MsoNormal">I’m clueless. Here is the correct output, running mpiexec inside of sbatch.<o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>
<p class="MsoNormal">Kurt<o:p></o:p></p>
<p class="MsoNormal"><o:p> </o:p></p>

From: Zhou, Hui <zhouh@anl.gov>
Sent: Friday, February 18, 2022 2:00 PM
To: Raffenetti, Ken <raffenet@anl.gov>; discuss@mpich.org
Cc: Mccall, Kurt E. (MSFC-EV41) <kurt.e.mccall@nasa.gov>
Subject: [EXTERNAL] Re: [mpich-discuss] MPI_Init hangs under Slurm
<p class="MsoNormal"><o:p> </o:p></p>
<div>
<p class="MsoNormal"><span style="font-size:12.0pt;color:black">Hi Kurt,<o:p></o:p></span></p>
</div>
<div>
<p class="MsoNormal"><span style="font-size:12.0pt;color:black"><o:p> </o:p></span></p>
</div>
<div>
<p class="MsoNormal"><span style="font-size:12.0pt;color:black">Did you run </span>
<code><span style="font-size:10.0pt;color:black">mpiexec</span></code><span style="font-size:12.0pt;color:black"> inside
</span><code><span style="font-size:10.0pt;color:black">sbatch</span></code><span style="font-size:12.0pt;color:black">? It will need
</span><code><span style="font-size:10.0pt;color:black">sbatch</span></code><span style="font-size:12.0pt;color:black"> to allocate the nodes.<o:p></o:p></span></p>
</div>
<div>
<p class="MsoNormal"><span style="font-size:12.0pt;color:black"><o:p> </o:p></span></p>
</div>
<div>
<p class="MsoNormal"><span style="font-size:12.0pt;color:black">-- <o:p></o:p></span></p>
</div>
<div>
<p class="MsoNormal"><span style="font-size:12.0pt;color:black">Hui<o:p></o:p></span></p>
</div>
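
For reference, a minimal sketch of an sbatch script that runs mpiexec inside the allocation, assuming MPICH's Hydra mpiexec is first on PATH in the job environment (the script and program names here are hypothetical):

    #!/bin/bash
    #SBATCH --nodes=20
    #SBATCH --ntasks=20
    #SBATCH --exclusive

    # Inside the allocation, Hydra should pick up the node list from the
    # Slurm environment, so no explicit host list is needed.
    mpiexec -np 20 -ppn 1 ./my_mpi_program

Submitting it with "sbatch job.sh" then gives mpiexec the 20 allocated nodes to launch across.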
<div class="MsoNormal" align="center" style="text-align:center">
<hr size="2" width="98%" align="center">
</div>
<div id="divRplyFwdMsg">
<p class="MsoNormal"><b><span style="color:black">From:</span></b><span style="color:black"> Mccall, Kurt E. (MSFC-EV41) via discuss <<a href="mailto:discuss@mpich.org">discuss@mpich.org</a>><br>
<b>Sent:</b> Friday, February 18, 2022 1:39 PM<br>
<b>To:</b> Raffenetti, Ken <<a href="mailto:raffenet@anl.gov">raffenet@anl.gov</a>>;
<a href="mailto:discuss@mpich.org">discuss@mpich.org</a> <<a href="mailto:discuss@mpich.org">discuss@mpich.org</a>><br>
<b>Cc:</b> Mccall, Kurt E. (MSFC-EV41) <<a href="mailto:kurt.e.mccall@nasa.gov">kurt.e.mccall@nasa.gov</a>><br>
<b>Subject:</b> Re: [mpich-discuss] MPI_Init hangs under Slurm</span> <o:p></o:p></p>
<div>
<p class="MsoNormal"> <o:p></o:p></p>
</div>
</div>

Here is the --verbose output. Is it trying to launch the processes all on the head node rocci.ndc.nasa.gov?

Kurt
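
One quick way to see where the launcher is actually placing processes, assuming the same launcher options as the mpiexec command quoted further below:

    mpiexec -launcher ssh -np 20 -ppn 1 hostname | sort | uniq -c

Twenty distinct hostnames, once each, means one process per node; the head node's name repeated 20 times would mean everything is landing there.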

-----Original Message-----
From: Raffenetti, Ken <raffenet@anl.gov>
Sent: Friday, February 18, 2022 1:31 PM
To: discuss@mpich.org
Cc: Mccall, Kurt E. (MSFC-EV41) <kurt.e.mccall@nasa.gov>
Subject: [EXTERNAL] Re: [mpich-discuss] MPI_Init hangs under Slurm

From the looks of it, the ssh launcher might not be able to access all the nodes. To confirm, can you try launching a non-MPI program? Something like

mpiexec -verbose -launcher ssh -print-all-exitcodes -np 20 -ppn 1 hostname

Ken
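
A related check, assuming passwordless ssh is expected from the head node and run inside the allocation (where SLURM_JOB_NODELIST is set), is to probe each node directly:

    # scontrol ships with Slurm; BatchMode makes ssh fail fast instead of prompting.
    for host in $(scontrol show hostnames "$SLURM_JOB_NODELIST"); do
        ssh -o BatchMode=yes "$host" true || echo "cannot reach $host"
    done

Any "cannot reach" line points at a node the ssh launcher would also fail on.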

On 2/17/22, 2:39 PM, "Mccall, Kurt E. (MSFC-EV41) via discuss" <discuss@mpich.org> wrote:

Sorry, my attachment with an .out extension was blocked. Here is the file with a .txt extension.

From: Mccall, Kurt E. (MSFC-EV41) <kurt.e.mccall@nasa.gov>
Sent: Thursday, February 17, 2022 2:36 PM
To: discuss@mpich.org
Cc: Mccall, Kurt E. (MSFC-EV41) <kurt.e.mccall@nasa.gov>
Subject: MPI_Init hangs under Slurm

Things were working fine when I was launching one-node jobs under Slurm 20.11.8, but when I launched a 20-node job, MPICH hung in MPI_Init. The output of “mpiexec -verbose” is attached, and the stack trace at the point where it hangs is below.

In the “mpiexec -verbose” output, I wonder why variables such as PATH_modshare point to our Intel MPI implementation, which I am not using. I am using MPICH 4.0 with a patch that Ken Raffenetti provided (which makes MPICH recognize the “host” info key). My $PATH and $LD_LIBRARY_PATH variables definitely point to the correct MPICH installation.
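
One way to confirm which stack actually resolves inside the batch environment is to print it just before launching (a sketch; the program name here is hypothetical):

    # Run inside the sbatch script, just before mpiexec:
    which mpiexec
    mpiexec --version | head -n 2
    ldd ./my_mpi_program | grep -i mpi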

I appreciate any help you can give.

Here is the Slurm sbatch command:

sbatch --nodes=20 --ntasks=20 --job-name $job_name --exclusive --verbose

Here is the mpiexec command:

mpiexec -verbose -launcher ssh -print-all-exitcodes -np 20 -wdir ${work_dir} -env DISPLAY localhost:10.0 --ppn 1 <many more args…>

Stack trace at MPI_Init:

#0  0x00007f6d85f499b2 in read () from /lib64/libpthread.so.0
#1  0x00007f6d87a5753a in PMIU_readline (fd=5, buf=buf@entry=0x7ffd6fb596e0 "", maxlen=maxlen@entry=1024)
    at ../mpich-slurm-patch-4.0/src/pmi/simple/simple_pmiutil.c:134
#2  0x00007f6d87a57a56 in GetResponse (request=0x7f6d87b48351 "cmd=barrier_in\n",
    expectedCmd=0x7f6d87b48345 "barrier_out", checkRc=0) at ../mpich-slurm-patch-4.0/src/pmi/simple/simple_pmi.c:818
#3  0x00007f6d87a29915 in MPIDI_PG_SetConnInfo (rank=rank@entry=0,
    connString=connString@entry=0x1bbf5a0 "description#n001$port#33403$ifname#172.16.56.1$")
    at ../mpich-slurm-patch-4.0/src/mpid/ch3/src/mpidi_pg.c:559
#4  0x00007f6d87a38611 in MPID_nem_init (pg_rank=pg_rank@entry=0, pg_p=pg_p@entry=0x1bbf850, has_parent=<optimized out>)
    at ../mpich-slurm-patch-4.0/src/mpid/ch3/channels/nemesis/src/mpid_nem_init.c:393
#5  0x00007f6d87a2ad93 in MPIDI_CH3_Init (has_parent=<optimized out>, pg_p=0x1bbf850, pg_rank=0)
    at ../mpich-slurm-patch-4.0/src/mpid/ch3/channels/nemesis/src/ch3_init.c:83
#6  0x00007f6d87a1b3b7 in init_world () at ../mpich-slurm-patch-4.0/src/mpid/ch3/src/mpid_init.c:190
#7  MPID_Init (requested=<optimized out>, provided=provided@entry=0x7f6d87e03540 <MPIR_ThreadInfo>)
    at ../mpich-slurm-patch-4.0/src/mpid/ch3/src/mpid_init.c:76
#8  0x00007f6d879828eb in MPII_Init_thread (argc=argc@entry=0x7ffd6fb5a5cc, argv=argv@entry=0x7ffd6fb5a5c0,
    user_required=0, provided=provided@entry=0x7ffd6fb5a574, p_session_ptr=p_session_ptr@entry=0x0)
    at ../mpich-slurm-patch-4.0/src/mpi/init/mpir_init.c:208
#9  0x00007f6d879832a5 in MPIR_Init_impl (argc=0x7ffd6fb5a5cc, argv=0x7ffd6fb5a5c0)
    at ../mpich-slurm-patch-4.0/src/mpi/init/mpir_init.c:93
#10 0x00007f6d8786388e in PMPI_Init (argc=0x7ffd6fb5a5cc, argv=0x7ffd6fb5a5c0)
    at ../mpich-slurm-patch-4.0/src/binding/c/init/init.c:46
#11 0x000000000040640d in main (argc=23, argv=0x7ffd6fb5ad68) at src/NeedlesMpiManagerMain.cpp:53