[mpich-discuss] mpi_gather slow on a single node
Halim Amer
aamer at anl.gov
Mon Dec 5 10:48:12 CST 2016
Hi Keith,
There is too little information to answer your question.
> 2) The latency changes by > 10x over 100 iterations. Is that normal?
What is the baseline you are comparing against? Do you mean memory
latency? If so, how do you measure it, and where do you fetch the
data from? What is your hardware?
> MPICH configure:
--prefix=/group/astronomy856/ban115/mpich/build-ingest-debug
--enable-error-messages=all --enable-timing=all --enable-g=most
You are trying to understand whether there is a performance anomaly, yet
you built MPICH in debugging mode. I suggest configuring with
*--enable-fast=O3,ndebug* and removing the other flags you supplied.
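A reconfigure along those lines might look like this (the install prefix here is illustrative, not the poster's actual path; *--enable-fortran=no* is carried over from the original configure line):

```shell
# Rebuild MPICH with -O3 and assertions disabled; keep only the
# flags needed for a performance run. The prefix is illustrative.
./configure --prefix=$HOME/mpich/build-fast \
            --enable-fast=O3,ndebug \
            --enable-fortran=no
make -j4 && make install
```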
Halim
www.mcs.anl.gov/~aamer
On 12/1/16 5:10 PM, Keith.Bannister at csiro.au wrote:
> Hi,
>
> I think I have a problem with my supercomputer :-(
>
> I’m new to MPI and HPC.
>
> Below are the results of osu_gather running on a single node. I’m interested in the 16 MB message size.
>
> 1) I expected mpi_gather on a single node to have throughput of roughly the memory bandwidth. Currently it’s averaging much less, i.e. ~7 Gbps for the 16 MB message size. Adding MPICH_NO_LOCAL=1 gives essentially identical results.
> 2) The latency changes by > 10x over 100 iterations. Is that normal?
>
>
>> mpirun -n 12 ./osu_gather -m 33554432 -f -M 1073741842
>
> # OSU MPI Gather Latency Test v5.3.2
>
> # Size    Avg Latency(us)  Min Latency(us)  Max Latency(us)  Iterations
>        1             2.49             0.52             7.51        1000
>        2             2.48             0.48             7.50        1000
>        4             2.58             0.52             7.70        1000
>        8             2.53             0.54             7.61        1000
>       16             2.66             0.54             8.08        1000
>       32             2.86             0.64             8.44        1000
>       64             2.92             0.61             8.59        1000
>      128             3.08             0.74             8.95        1000
>      256             3.27             0.84             9.57        1000
>      512             3.62             0.90            10.97        1000
>     1024             6.80             0.70            20.08        1000
>     2048             8.38             0.76            26.14        1000
>     4096            11.27             1.09            36.73        1000
>     8192            16.37             1.69            55.65        1000
>    16384            32.40             3.24           104.44         100
>    32768            57.54             6.66           174.95         100
>    65536           109.20            18.30           315.05         100
>   131072           201.53            32.24           578.51         100
>   262144           226.00            40.95           631.46         100
>   524288           405.27            72.43          1129.47         100
>  1048576           925.78           148.69          2608.16         100
>  2097152          2066.67           342.48          5678.46         100
>  4194304          4287.71           806.01         11652.06         100
>  8388608          8589.63          1754.01         23237.47         100
> 16777216         19515.15          3522.29         49930.88         100
> 33554432         40325.59         12634.90         98571.13         100
>
>
>> mpichversion
> MPICH Version: 3.2
> MPICH Release date: Wed Nov 11 22:06:48 CST 2015
> MPICH Device: ch3:nemesis
> MPICH configure: --prefix=/group/astronomy856/ban115/mpich/build-ingest-debug --enable-error-messages=all --enable-timing=all --enable-g=most --enable-fortran=no
> MPICH CC: gcc -g -O2
> MPICH CXX: g++ -g -O2
> MPICH F77: gfortran -g
> MPICH FC: gfortran -g
> --
> KEITH BANNISTER
> CSIRO Astronomy and Space Science
> T +61 2 9372 4295
> E keith.bannister at csiro.au
>
_______________________________________________
discuss mailing list discuss at mpich.org
To manage subscription options or unsubscribe:
https://lists.mpich.org/mailman/listinfo/discuss