[mpich-discuss] MPI_Get on the same memory location

Nick Radcliffe nradclif at cray.com
Thu Aug 21 20:42:05 CDT 2014


> If buffers overlap, use memmove and not memcpy:

Good point; you would have to use memmove in general for overlapping source and target buffers. I was thinking of the special case where the source and target buffers are identical.
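
To be concrete, a copy helper could special-case identical buffers and otherwise fall back to memmove. This is a hypothetical sketch, not MPICH's actual MPIR_Localcopy:

    #include <string.h>

    /* Hypothetical copy helper: tolerate dst == src, and use memmove
       (defined for any overlap) rather than memcpy (undefined for
       overlapping buffers). */
    void local_copy(void *dst, const void *src, size_t len)
    {
        if (dst == src)
            return;              /* identical buffers: nothing to copy */
        memmove(dst, src, len);
    }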

> I don’t think so.  It’s an incorrect program to get from a buffer to itself.  MPICH is being helpful in catching this user error.

Do you know where in the standard this is specified? I'm having trouble finding it.

-Nick

________________________________________
From: Boisvert, Sebastien [boisvert at anl.gov]
Sent: Thursday, August 21, 2014 7:30 PM
To: discuss at mpich.org
Subject: Re: [mpich-discuss] MPI_Get on the same memory location

> ________________________________________
> From: Nick Radcliffe [nradclif at cray.com]
> Sent: Thursday, August 21, 2014 5:47 PM
> To: discuss at mpich.org
> Subject: Re: [mpich-discuss] MPI_Get on the same memory location
> > MPIR_Localcopy(357): memcpy arguments alias each other, dst=0x19a5f40
> > src=0x19a5f40 len=4
> > It looks like MPICH is checking that the memcpy source and destination buffers don't overlap. This seems like a bug to me -- when doing an MPI_Get from a buffer to itself, the implementation should probably just do nothing and return.

I disagree.

If buffers overlap, use memmove and not memcpy:

" If copying takes place between objects that overlap, the behavior is undefined."

--source: http://pubs.opengroup.org/onlinepubs/009695399/functions/memcpy.html

Also:

http://stackoverflow.com/questions/1960991/which-one-to-use-memmove-or-memcpy-when-buffers-dont-overlap
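
A small standalone illustration of the difference (the array contents here are arbitrary):

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char a[8] = "abcdefg";
        /* Source a[0..5] and destination a[1..6] overlap, so
           memcpy(a + 1, a, 6) would be undefined behavior here;
           memmove must act as if it copied through a temporary. */
        memmove(a + 1, a, 6);
        printf("%s\n", a);       /* prints "aabcdef" */
        return 0;
    }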



> -Nick
> ________________________________________
> From: alessandro.fanfarillo at gmail.com [alessandro.fanfarillo at gmail.com] on behalf of Alessandro Fanfarillo [fanfarillo at ing.uniroma2.it]
> Sent: Thursday, August 21, 2014 5:25 PM
> To: discuss at mpich.org
> Subject: [mpich-discuss] MPI_Get on the same memory location
> Dear all,
> I'm getting the following error:
> Fatal error in MPI_Get: Internal MPI error!, error stack:
> MPI_Get(156).......: MPI_Get(origin_addr=0x19a5f40, origin_count=4,
> MPI_BYTE, target_rank=0, target_disp=0, target_count=4, MPI_BYTE,
> win=0xa0000000) failed
> MPIDI_Get(247).....:
> MPIR_Localcopy(357): memcpy arguments alias each other, dst=0x19a5f40
> src=0x19a5f40 len=4
> when I try to execute MPI_Get on the same memory location on a
> shared-memory machine (my laptop).
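> A minimal sketch of the call pattern that triggers it, run on a
> single rank (variable names are illustrative):
>
>     #include <mpi.h>
>
>     int main(int argc, char **argv)
>     {
>         int buf = 42;    /* 4 bytes, matching len=4 in the error */
>         MPI_Win win;
>
>         MPI_Init(&argc, &argv);
>         MPI_Win_create(&buf, sizeof(buf), 1, MPI_INFO_NULL,
>                        MPI_COMM_WORLD, &win);
>         MPI_Win_fence(0, win);
>         /* Origin buffer and target window location are the same
>            address on rank 0. */
>         MPI_Get(&buf, 4, MPI_BYTE, 0, 0, 4, MPI_BYTE, win);
>         MPI_Win_fence(0, win);
>         MPI_Win_free(&win);
>         MPI_Finalize();
>         return 0;
>     }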
> I cannot find anything in the standard that forbids this for
> one-sided operations.
> Running with OpenMPI, everything works fine.
> Is it a bug, or did I miss something in the standard?
> Thanks.
> Alessandro
> --
> Alessandro Fanfarillo
> Dip. di Ingegneria Civile ed Ingegneria Informatica
> Università di Roma "Tor Vergata"
> NCAR Office: +1 (303) 497-2442
> Tel: +39-06-7259 7719
_______________________________________________
discuss mailing list     discuss at mpich.org
To manage subscription options or unsubscribe:
https://lists.mpich.org/mailman/listinfo/discuss


