[mpich-discuss] Better alternatives of MPI_Allreduce()

Joachim Protze protze at itc.rwth-aachen.de
Tue May 5 02:00:20 CDT 2020


It is important to understand that most of the time you see is not the
cost of the allreduce itself, but the cost of synchronization (caused by
load imbalance).
An easy experiment is to add a barrier before the allreduce. Then you
will see the actual cost of the allreduce, while the cost of
synchronization goes into the barrier.

Now, think about the dependencies in your algorithm: do you need the
output value immediately? Is it needed at the same point where the input
value becomes available? If not, use non-blocking communication and
perform independent work in between.

In any case: fix your load imbalance (the root cause of the
synchronization cost).


On 05.05.20 at 07:38, hritikesh semwal via discuss wrote:
> Hello all,
> I am working on the development of a parallel CFD solver and I am using 
> MPI_Allreduce for the global summation of the local errors calculated on 
> all processes of a group and the summation is to be used by all the 
> processes. My concern is that MPI_Allreduce is taking almost 27-30% of 
> the total time used, which is a significant amount. So, I want to ask if 
> anyone can suggest a better alternative to MPI_Allreduce that can 
> reduce the time consumption.
> Thank you.
> _______________________________________________
> discuss mailing list     discuss at mpich.org
> To manage subscription options or unsubscribe:
> https://lists.mpich.org/mailman/listinfo/discuss

Dipl.-Inf. Joachim Protze

IT Center
Group: High Performance Computing
Division: Computational Science and Engineering
RWTH Aachen University
Seffenter Weg 23
D 52074  Aachen (Germany)
Tel: +49 241 80-24765
Fax: +49 241 80-624765
protze at itc.rwth-aachen.de

