[mpich-discuss] MPI_Get on the same memory location

Boisvert, Sebastien boisvert at anl.gov
Thu Aug 21 19:31:53 CDT 2014


> ________________________________________
> From: Balaji, Pavan [balaji at anl.gov]
> Sent: Thursday, August 21, 2014 7:27 PM
> To: discuss at mpich.org
> Subject: Re: [mpich-discuss] MPI_Get on the same memory location
> I don’t think so.  It’s an incorrect program to get from a buffer to itself.  MPICH is being helpful in catching this user error.

This is also my opinion.
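
For reference, here is a minimal sketch of the call pattern the error trace points at (my reconstruction -- Alessandro did not post his code): the origin buffer passed to MPI_Get is the very memory that rank 0's window exposes, so the source and destination of the internal local copy coincide.

    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int buf = 42;
        MPI_Win win;

        MPI_Init(&argc, &argv);

        /* Every rank exposes its own buf through an RMA window. */
        MPI_Win_create(&buf, sizeof buf, 1, MPI_INFO_NULL,
                       MPI_COMM_WORLD, &win);

        MPI_Win_lock(MPI_LOCK_SHARED, 0, 0, win);
        /* On rank 0 the origin buffer *is* the target window memory,
           so the internal local copy sees dst == src (the
           MPIR_Localcopy error above) and MPICH aborts. */
        MPI_Get(&buf, 4, MPI_BYTE, 0, 0, 4, MPI_BYTE, win);
        MPI_Win_unlock(0, win);

        MPI_Win_free(&win);
        MPI_Finalize();
        return 0;
    }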

> — Pavan
> On Aug 21, 2014, at 7:10 PM, Nick Radcliffe <nradclif at cray.com> wrote:
> > I didn't mean there was a bug in memcpy. I meant the bug was in MPI_Get, which should indeed be checking that pointers are not aliased.
> >
> > -Nick
> >
> > From: Brian Van Straalen [bvstraalen at lbl.gov]
> > Sent: Thursday, August 21, 2014 6:21 PM
> > To: discuss at mpich.org
> > Subject: Re: [mpich-discuss] MPI_Get on the same memory location
> >
> > It depends on which version of memcpy you are using.  If you are calling the IEEE Std 1003.1-2004 (POSIX) version, then memcpy is declared as
> >
> > void *memcpy(void *restrict s1, const void *restrict s2, size_t n);
> >
> > The restrict qualifiers assert that the source and destination do not alias, so this form of calling memcpy is illegal and the reported error is correct.
> >
> > The routine calling memcpy should verify that the buffers are not aliased before making the call; otherwise, incorrect behavior is very likely on newer architectures.
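> >
> > A guard of this shape ahead of the call (a sketch, not MPICH's actual code) makes the aliasing check explicit:
> >
> >     #include <stddef.h>   /* size_t */
> >     #include <string.h>   /* memcpy */
> >
> >     /* Hypothetical helper: skip the copy when the pointers are
> >        identical, since memcpy's restrict contract forbids any
> >        aliasing between the two buffers. */
> >     static void copy_checked(void *dst, const void *src, size_t n)
> >     {
> >         if (dst == src || n == 0)
> >             return;               /* nothing to do */
> >         memcpy(dst, src, n);      /* caller guarantees no partial overlap */
> >     }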
> >
> > Brian Van Straalen
> >
> >
> > On Aug 21, 2014, at 3:47 PM, Nick Radcliffe <nradclif at cray.com> wrote:
> >
> >>> MPIR_Localcopy(357): memcpy arguments alias each other, dst=0x19a5f40
> >> src=0x19a5f40 len=4
> >>
> >> It looks like memcpy is doing a check to make sure the source and destination buffers don't overlap. This seems like a bug to me -- when doing an MPI_Get from a buffer to itself, the implementation should probably just do nothing and return.
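> >>
> >> Something along these lines at the top of the copy routine would do it (a sketch with made-up names, not the real MPIR_Localcopy):
> >>
> >>     /* Hypothetical early exit: a get from a buffer to itself
> >>        leaves the data where it already is, so make it a no-op
> >>        instead of tripping the memcpy alias check. */
> >>     if (dst == src || len == 0)
> >>         return MPI_SUCCESS;
> >>     memcpy(dst, src, len);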
> >>
> >> -Nick
> >>
> >> ________________________________________
> >> From: alessandro.fanfarillo at gmail.com [alessandro.fanfarillo at gmail.com] on behalf of Alessandro Fanfarillo [fanfarillo at ing.uniroma2.it]
> >> Sent: Thursday, August 21, 2014 5:25 PM
> >> To: discuss at mpich.org
> >> Subject: [mpich-discuss] MPI_Get on the same memory location
> >>
> >> Dear all,
> >>
> >> I get the following error:
> >>
> >> Fatal error in MPI_Get: Internal MPI error!, error stack:
> >> MPI_Get(156).......: MPI_Get(origin_addr=0x19a5f40, origin_count=4,
> >> MPI_BYTE, target_rank=0, target_disp=0, target_count=4, MPI_BYTE,
> >> win=0xa0000000) failed
> >> MPIDI_Get(247).....:
> >> MPIR_Localcopy(357): memcpy arguments alias each other, dst=0x19a5f40
> >> src=0x19a5f40 len=4
> >>
> >> when I execute an MPI_Get on the same memory location on a shared-memory
> >> machine (my laptop).
> >>
> >> I cannot find anything in the standard that forbids this for one-sided operations.
> >>
> >> Running with OpenMPI, everything works fine.
> >>
> >> Is it a bug, or did I miss something in the standard?
> >>
> >> Thanks.
> >>
> >> Alessandro
> >>
> >> --
> >>
> >> Alessandro Fanfarillo
> >> Dip. di Ingegneria Civile ed Ingegneria Informatica
> >> Università di Roma "Tor Vergata"
> >> NCAR Office: +1 (303) 497-2442
> >> Tel: +39-06-7259 7719
> >> _______________________________________________
> >> discuss mailing list     discuss at mpich.org
> >> To manage subscription options or unsubscribe:
> >> https://lists.mpich.org/mailman/listinfo/discuss
> >
> --
> Pavan Balaji
> http://www.mcs.anl.gov/~balaji

