[mpich-discuss] Speed up issue with CFD solver.

hritikesh semwal hritikesh.semwal at gmail.com
Fri May 8 13:02:14 CDT 2020


What do you mean by logical partition? For 2 partitions it is a single
slice, and for 4 partitions it is a plus-shaped slice partitioning. The
computation time is decreasing continuously, but the communication time
increases from 0.69% to 8.9% of the total time going from 2 to 4
processes. I am using MPI_Neighbor_alltoallw for exchanging data after
forming a distributed graph topology.
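
For reference, the exchange step looks roughly like the sketch below.
This is a minimal illustration with made-up names, not my actual code;
it assumes a symmetric neighbourhood (the same ranks as sources and
destinations) and per-neighbour counts, displacements and datatypes
prepared during setup.

#include <mpi.h>

/* Build a distributed graph topology over the partition's neighbour
   ranks, then exchange halo data in one call. In a real solver the
   graph communicator would be created once at setup, not per exchange. */
void exchange_halos(MPI_Comm comm, int nnbrs, const int nbr_ranks[],
                    const void *sendbuf, const int scounts[],
                    const MPI_Aint sdispls[], const MPI_Datatype stypes[],
                    void *recvbuf, const int rcounts[],
                    const MPI_Aint rdispls[], const MPI_Datatype rtypes[])
{
    MPI_Comm graph;
    MPI_Dist_graph_create_adjacent(comm,
                                   nnbrs, nbr_ranks, MPI_UNWEIGHTED,
                                   nnbrs, nbr_ranks, MPI_UNWEIGHTED,
                                   MPI_INFO_NULL, 0 /* no reorder */, &graph);
    MPI_Neighbor_alltoallw(sendbuf, scounts, sdispls, stypes,
                           recvbuf, rcounts, rdispls, rtypes, graph);
    MPI_Comm_free(&graph);
}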
On Fri, May 8, 2020 at 8:34 PM Zhou, Hui <zhouh at anl.gov> wrote:

> I would first analyze the logical partition and its effects on the
> computation and communication. Do they change continuously from 2
> processes and up, or is there a bump?
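>
> For instance, a minimal per-phase timing sketch (with hypothetical
> do_compute()/do_exchange() helpers standing in for the solver's local
> work and halo exchange):
>
> #include <mpi.h>
>
> /* Accumulate wall time per phase, then take the max across ranks so
>    load imbalance shows up in the compute/communication split. */
> void time_phases(MPI_Comm comm, int nsteps,
>                  void (*do_compute)(void), void (*do_exchange)(void))
> {
>     double t[2] = {0.0, 0.0}, tmax[2], t0;
>     for (int step = 0; step < nsteps; ++step) {
>         t0 = MPI_Wtime(); do_compute();  t[0] += MPI_Wtime() - t0;
>         t0 = MPI_Wtime(); do_exchange(); t[1] += MPI_Wtime() - t0;
>     }
>     MPI_Reduce(t, tmax, 2, MPI_DOUBLE, MPI_MAX, 0, comm);
> }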
>
>
>
> --
> Hui Zhou
>
>
>
>
>
> *From: *hritikesh semwal via discuss <discuss at mpich.org>
> *Reply-To: *"discuss at mpich.org" <discuss at mpich.org>
> *Date: *Thursday, May 7, 2020 at 11:19 PM
> *To: *"discuss at mpich.org" <discuss at mpich.org>
> *Cc: *hritikesh semwal <hritikesh.semwal at gmail.com>
> *Subject: *[mpich-discuss] Speed up issue with CFD solver.
>
>
>
> Hello,
>
>
>
> I am developing a parallel CFD solver for unstructured grids with a
> finite volume explicit time integration method; the global mesh has
> 97,336 cells. I am trying to get speed-up results for my code, but
> something strange is happening. Following are my timing results after
> the second time step with different numbers of processors:
>
>
>
> 2 processors- 182.284 seconds
>
> 4 processors- 251.7035 seconds
>
> 8 processors- 190.8578 seconds
>
> 16 processors- 129.7989 seconds
>
> 24 processors- 121.0303 seconds
>
> 32 processors- 111.3772 seconds
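>
> (Relative speed-up against the 2-process run, S(p) = T(2)/T(p), works
> out to S(4) ≈ 0.72, S(8) ≈ 0.96, S(16) ≈ 1.40, S(24) ≈ 1.51 and
> S(32) ≈ 1.64; 4 processes are actually slower than 2.)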
>
>
>
> I do not understand why there is a sharp rise in time from 2 to 4
> processors, after which it decreases. I ran the same code 2 weeks ago;
> at that time the results were different from the present ones, and the
> time decreased continuously. Could someone please help me with this by
> suggesting the probable cause(s) of this behaviour?
>
>
>
> Thank you.
>