[mpich-discuss] MCS lock and MPI RMA problem

Jeff Hammond jeff.science at gmail.com
Mon Mar 6 12:20:47 CST 2017


What processor architecture are you testing?

Maybe declare lmem as volatile, or read it with MPI_Fetch_and_op rather than
with a plain load.  MPI_Win_sync cannot prevent the compiler from caching
*lmem in a register.

Jeff

On Sat, Mar 4, 2017 at 12:30 AM, Ask Jakobsen <afj at qeye-labs.com> wrote:

> Hi,
>
> I have downloaded the source code for the MCS lock from the excellent book
> "Using Advanced MPI" from
> http://www.mcs.anl.gov/research/projects/mpi/usingmpi/examples-advmpi/rma2/mcs-lock.c
>
> I have made a very simple piece of test code for the MCS lock, but it only
> works intermittently and often never escapes the busy loops in the acquire
> and release functions (see attached source code). The code appears
> semantically correct to my eyes.
>
> #include <stdio.h>
> #include <mpi.h>
> #include "mcs-lock.h"
>
> int main(int argc, char *argv[])
> {
>   MPI_Win win;
>   MPI_Init( &argc, &argv );
>
>   MCSLockInit(MPI_COMM_WORLD, &win);
>
>   int rank, size;
>   MPI_Comm_rank(MPI_COMM_WORLD, &rank);
>   MPI_Comm_size(MPI_COMM_WORLD, &size);
>
>   printf("rank: %d, size: %d\n", rank, size);
>
>
>   MCSLockAcquire(win);
>   printf("rank %d acquired lock\n", rank);   fflush(stdout);
>   MCSLockRelease(win);
>
>
>   MPI_Win_free(&win);
>   MPI_Finalize();
>   return 0;
> }
>
>
> I have tested on several hardware platforms and mpich-3.2 and mpich-3.3a2
> but with no luck.
>
> It appears that the MPI_Win_sync calls are not "refreshing" the local data,
> or I have a bug I can't spot.
>
> A simple unfair lock like
> http://www.mcs.anl.gov/research/projects/mpi/usingmpi/examples-advmpi/rma2/ga_mutex1.c
> works perfectly.
>
> Best regards, Ask Jakobsen
>
>
> _______________________________________________
> discuss mailing list     discuss at mpich.org
> To manage subscription options or unsubscribe:
> https://lists.mpich.org/mailman/listinfo/discuss
>



-- 
Jeff Hammond
jeff.science at gmail.com
http://jeffhammond.github.io/

