[mpich-discuss] Maximum number of inter-communicators?

Zhou, Hui zhouh at anl.gov
Sun Oct 24 18:46:08 CDT 2021


Hi Kurt,

There is indeed a limit on the maximum number of communicators you can have, counting both intra-communicators and inter-communicators. Try freeing the communicators that you no longer need. In older versions of MPICH there may also be a limit on how many dynamic processes one can connect to. If you still hit the crash after making sure there aren't too many simultaneously active communicators, could you try the latest release -- http://www.mpich.org/static/downloads/4.0a2/mpich-4.0a2.tar.gz -- and see whether the issue persists?
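
For example, here is a minimal sketch (the worker executable name and the counts are placeholders, not taken from your code) of spawning a worker over an inter-communicator and releasing it as soon as the work is done:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);

        /* Spawn one worker; worker_comm is the resulting inter-communicator. */
        MPI_Comm worker_comm;
        MPI_Comm_spawn("./worker", MPI_ARGV_NULL, 1, MPI_INFO_NULL,
                       0, MPI_COMM_SELF, &worker_comm, MPI_ERRCODES_IGNORE);

        /* ... exchange messages with the worker over worker_comm ... */

        /* Release the inter-communicator once it is no longer needed so its
           context can be reused.  MPI_Comm_disconnect waits for pending
           communication to complete; MPI_Comm_free also works if nothing
           is outstanding. */
        MPI_Comm_disconnect(&worker_comm);

        MPI_Finalize();
        return 0;
    }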

--
Hui
________________________________
From: Mccall, Kurt E. (MSFC-EV41) via discuss <discuss at mpich.org>
Sent: Sunday, October 24, 2021 2:37 PM
To: discuss at mpich.org <discuss at mpich.org>
Cc: Mccall, Kurt E. (MSFC-EV41) <kurt.e.mccall at nasa.gov>
Subject: [mpich-discuss] Maximum number of inter-communicators?


Hi,



Based on a paper I read about giving an MPI job some fault tolerance, I’m exclusively connecting my processes with inter-communicators. I’ve found that if I increase the number of processes beyond a certain point, many processes don’t get created at all and the whole job crashes. Am I running up against an operating system limit (like the number of open file descriptors – it is set at 1024), or some sort of MPICH limit?
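
(For illustration only, not part of my actual code: the 1024 descriptor limit can also be read from inside a process with the standard POSIX getrlimit call, which is what bounds how many sockets the MPI library can hold open.)

    #include <stdio.h>
    #include <sys/resource.h>

    int main(void)
    {
        struct rlimit rl;

        /* RLIMIT_NOFILE is the per-process limit on open file descriptors. */
        if (getrlimit(RLIMIT_NOFILE, &rl) == 0)
            printf("open-file limit: soft=%llu hard=%llu\n",
                   (unsigned long long)rl.rlim_cur,
                   (unsigned long long)rl.rlim_max);
        return 0;
    }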



If it matters, my process architecture (a tree) is as follows: one master process connected to 21 manager processes on 21 other nodes, and each manager connected to 8 worker processes on the manager’s own node. This is the largest job I’ve been able to create without it crashing. Attempting to increase the number of workers beyond 8 results in a crash.
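
For concreteness, here is a rough sketch of how a manager could set up its side of the tree (simplified; the executable name, host hint, and error handling are placeholders rather than my actual code), assuming one MPI_Comm_spawn per worker so that each worker gets its own inter-communicator:

    #include <mpi.h>

    #define NUM_WORKERS 8   /* workers per manager, per the description above */

    int main(int argc, char **argv)
    {
        MPI_Init(&argc, &argv);

        /* Inter-communicator back to the master that spawned this manager. */
        MPI_Comm master_comm;
        MPI_Comm_get_parent(&master_comm);

        /* Hint to place the workers; the actual host selection depends on
           the launcher setup, so "localhost" is only a placeholder. */
        MPI_Info info;
        MPI_Info_create(&info);
        MPI_Info_set(info, "host", "localhost");

        /* Spawn each worker separately so a failure affects only that
           worker's own inter-communicator. */
        MPI_Comm worker_comm[NUM_WORKERS];
        for (int i = 0; i < NUM_WORKERS; i++) {
            MPI_Comm_spawn("./worker", MPI_ARGV_NULL, 1, info,
                           0, MPI_COMM_SELF, &worker_comm[i],
                           MPI_ERRCODES_IGNORE);
        }
        MPI_Info_free(&info);

        /* ... distribute work over worker_comm[0..NUM_WORKERS-1] ... */

        for (int i = 0; i < NUM_WORKERS; i++)
            MPI_Comm_disconnect(&worker_comm[i]);
        if (master_comm != MPI_COMM_NULL)
            MPI_Comm_disconnect(&master_comm);

        MPI_Finalize();
        return 0;
    }

With 21 managers and 8 workers each, that is 1 + 21 + 21*8 = 190 processes and, with one spawn per worker, at least 189 simultaneous inter-communicators.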



I’m using MPICH 3.3.2 on CentOS (kernel 3.10.0). MPICH was compiled with the Portland Group compiler pgc++ 19.5-0.



Thanks,

Kurt