[mpich-discuss] MPICH-3.1.4 with PAMID: Win_lock

Balaji, Pavan balaji at anl.gov
Thu Mar 30 11:54:00 CDT 2017


No, it's not a bug.  PAMID only supports MPI-2.1.  Before you lock multiple targets concurrently, you need to check that MPI_VERSION >= 3.  MPI-2 only supported a lock to a single target at a time; MPI-3 added support for multiple concurrent lock epochs.
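For example, a portability guard could look something like the sketch below (illustrative only; the window setup and the target ranks A and B are made up, and error checking is omitted).  MPI_VERSION is a compile-time constant in mpi.h, so the check works in the preprocessor:

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, buf = 0;
        MPI_Win win;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        /* Hypothetical window: one int of local memory per process. */
        MPI_Win_create(&buf, sizeof(int), sizeof(int), MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        /* Illustrative distinct targets; run with at least 2 processes. */
        int A = 0, B = 1;

        if (rank == 0) {
    #if MPI_VERSION >= 3
            /* MPI-3: concurrent lock epochs on one window are allowed. */
            MPI_Win_lock(MPI_LOCK_SHARED, A, 0, win);
            MPI_Win_lock(MPI_LOCK_SHARED, B, 0, win);
            /* ... RMA operations to A and B ... */
            MPI_Win_unlock(B, win);
            MPI_Win_unlock(A, win);
    #else
            /* MPI-2: at most one lock epoch at a time, so lock in turn. */
            MPI_Win_lock(MPI_LOCK_SHARED, A, 0, win);
            /* ... RMA operations to A ... */
            MPI_Win_unlock(A, win);
            MPI_Win_lock(MPI_LOCK_SHARED, B, 0, win);
            /* ... RMA operations to B ... */
            MPI_Win_unlock(B, win);
    #endif
        }

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }

If you prefer a runtime check, MPI_Get_version() returns the same information.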

Also, pamid is no longer supported in MPICH.  We recommend the MPICH-3.3/OFI/BGQ path for Blue Gene/Q.
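If you go that route, the build is the usual configure/make flow, roughly like the lines below (a sketch, not exact instructions; the libfabric install path is a placeholder, and the MPICH README has the BGQ specifics):

    ./configure --with-device=ch4:ofi --with-libfabric=/path/to/libfabric
    make && make install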

  -- Pavan

> On Mar 30, 2017, at 10:12 AM, Jeff Hammond <jeff.science at gmail.com> wrote:
> 
> I guess this is a bug, but MPICH 3.1.x isn't the basis for the supported MPI on BGQ, so I doubt you will get much traction by reporting it.  IBM made an effort to support MPI-3 with PAMID, but it was an open-source, best-effort project, and I recall there were some issues with it, including deadlock in certain asynchronous operations.
> 
> You should try the supported MPI-2.2 implementation on BGQ or the unsupported OFI-based implementation of MPI-3.
> 
> Disclaimer: The comments above are made in my capacity as the person who used to support MPI at ALCF, not in my current job role.
> 
> Best,
> 
> Jeff
> 
> On Thu, Mar 30, 2017 at 3:21 AM, Sebastian Rinke <rinke at cs.tu-darmstadt.de> wrote:
> Same window, i.e.:
> 
> Process 0:
> 
> Win_lock(MPI_LOCK_SHARED, target=A, window=win)
> Win_lock(MPI_LOCK_SHARED, target=B, window=win)
> 
> Sebastian
> 
> 
> On 30 Mar 2017, at 06:24, Jeff Hammond <jeff.science at gmail.com> wrote:
> 
>> Same window or different windows?
>> 
>> Jeff
>> 
>> On Wed, Mar 29, 2017 at 5:59 PM Sebastian Rinke <rinke at cs.tu-darmstadt.de> wrote:
>> Dear all,
>> 
>> I have an issue with MPI_Win_lock in MPICH-3.1.4 on Blue Gene/Q.
>> 
>> Here is my example:
>> 
>> Process 0:
>> 
>> Win_lock(MPI_LOCK_SHARED, target=A)
>> Win_lock(MPI_LOCK_SHARED, target=B)
>> 
>> No matter what I use for A and B (given A != B), a process cannot acquire more than one lock
>> at a time.
>> 
>> To my understanding, it should be possible to acquire more than one lock.
>> 
>> Can you confirm this issue?
>> 
>> Thanks,
>> Sebastian
>> -- 
>> Jeff Hammond
>> jeff.science at gmail.com
>> http://jeffhammond.github.io/
> 
> -- 
> Jeff Hammond
> jeff.science at gmail.com
> http://jeffhammond.github.io/

_______________________________________________
discuss mailing list     discuss at mpich.org
To manage subscription options or unsubscribe:
https://lists.mpich.org/mailman/listinfo/discuss

