[mpich-discuss] Fortran 2008 MPI Interface Bug

Zhou, Hui zhouh at anl.gov
Thu Nov 21 11:12:51 CST 2019


Hi Pedro Ricardo,

I have added some follow-up for the reported issue at https://github.com/pmodels/mpich/issues/4170.

gfortran 9.2 confirms your observation.

The Intel compiler, ifort, works correctly.

I will try to narrow the issue down further when I find some time, but it appears to be an issue in the gfortran implementation.

The `mpi_f08` binding can be implemented as a thin Fortran layer that essentially wraps the `F77` binding, or it can be implemented almost entirely with Fortran's `iso_c_binding`, wrapping the native MPI C library directly. MPICH adopts the latter approach.
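For a rough idea of the latter approach, here is a minimal sketch (not MPICH's actual source; the module and interface names below are made up for illustration) of binding a Fortran routine straight to the C library with `iso_c_binding`:

module f08_binding_sketch
    use, intrinsic :: iso_c_binding, only: c_int
    implicit none

    interface
        ! Bind directly to the C symbol, with no intermediate Fortran layer.
        ! Note: integer(c_int) for the comm argument matches MPICH, where
        ! MPI_Comm is `typedef int MPI_Comm;` other MPI implementations may
        ! represent handles differently.
        function c_MPI_Comm_rank(comm, rank) &
                bind(C, name="MPI_Comm_rank") result(ierror)
            import :: c_int
            integer(c_int), value, intent(in) :: comm
            integer(c_int), intent(out) :: rank
            integer(c_int) :: ierror
        end function c_MPI_Comm_rank
    end interface
end module f08_binding_sketch

In the real binding, the choice buffers are declared `type(*), dimension(..)` and cross the language boundary as array descriptors; how the compiler builds and restores those descriptors is exactly where an issue like the one you are seeing can creep in.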

If you are interested, you may refer to this paper: https://www.mcs.anl.gov/~balaji/pubs/2014/eurompi/eurompi14.f08binding.pdf

—
Hui Zhou

On Nov 13, 2019, at 4:45 PM, Zhou, Hui via discuss <discuss at mpich.org> wrote:

Hi Pedro Ricardo,

Thanks for reporting the issue. I have opened a GitHub issue, https://github.com/pmodels/mpich/issues/4170, for better tracking.

`mpi_f08` tries to use `iso_c_binding` as much as it can, and it has not received as much testing as `use mpi`. One of the reasons is that compiler support has been sporadic. May I ask which compiler (including version) you were using?
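(For context, the user-visible difference is that `mpi_f08` turns the INTEGER handles of `use mpi` into derived types and makes the `ierror` argument optional; a minimal illustration, independent of the bug reported below:)

program f08_style
    use mpi_f08                        ! handles become derived types
    implicit none
    type(MPI_Comm) :: comm
    integer :: rank

    call MPI_Init()                    ! ierror may be omitted under mpi_f08
    comm = MPI_COMM_WORLD              ! type(MPI_Comm), not INTEGER
    call MPI_Comm_rank(comm, rank)
    call MPI_Finalize()
end program f08_style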

Anyway, we’ll look into it and hopefully get it fixed shortly.

—
Hui Zhou
Principal Software Development Specialist
MCS | ANL



On Nov 13, 2019, at 1:59 PM, Pedro Ricardo Souza via discuss <discuss at mpich.org> wrote:

Hi there, my name is Pedro Ricardo and I'm currently a researcher at a Fluid Dynamics Laboratory in Brazil.
We have CFD software with 12 continuous years of development, mostly in Fortran, and we've been using MPICH and Open MPI since the very beginning.

Recently we started to refactor the code and write it using the Fortran 2008 standard, and this includes the use of the newest MPI module for Fortran, `mpi_f08`.

So far no errors appeared with the Open MPI implementation and everything worked like a charm, but when we started compiling with MPICH, lots of errors occurred during execution.

I was able to isolate and reproduce the error in a simpler code. Whenever "use mpi_f08" is present, arrays passed to an MPI subroutine come back with C-like indexes. If I allocate a standard array in Fortran (which starts at index 1), after a call to some MPI routine this array comes back starting at index 0.
As you can imagine, all accesses to this array will be wrong after that.
It is funny behavior, because if you put "use mpi" instead of "use mpi_f08", the indexes do not change.

The test is simple: allocate an array, print its bound indexes, call MPI_Reduce, and print the array bounds again. I've also tested other routines, like MPI_Isend, and the error also happens.

program check_bounds
    ! Check works:
    ! use mpi
    ! Check fails:
    use mpi_f08

    implicit none
    integer:: mpi_ierr, proc_id, n_proc
    real, allocatable, dimension(:):: t1, t1_0

    call MPI_Init(mpi_ierr)
    call MPI_Comm_rank(MPI_COMM_WORLD, proc_id, mpi_ierr)
    call MPI_Comm_size(MPI_COMM_WORLD, n_proc, mpi_ierr)

    allocate(t1(5), t1_0(5))
    t1 = 1.0; t1_0 = 1.0

    if (proc_id==0) then
        write(*,*)'Array Configuration Before:'
        write(*,'(a6,2(1x,i2))')'Low:',lbound(t1),lbound(t1_0)
        write(*,'(a6,2(1x,i2))')'Upper:',ubound(t1),ubound(t1_0)
    end if

    CALL MPI_REDUCE(t1, t1_0, 5, MPI_REAL, MPI_SUM, 0, MPI_COMM_WORLD, mpi_ierr)

    if (proc_id==0) then
        write(*,*)'Array Configuration After:'
        write(*,'(a6,2(1x,i2))')'Low:',lbound(t1),lbound(t1_0)
        write(*,'(a6,2(1x,i2))')'Upper:',ubound(t1),ubound(t1_0)
    end if

    deallocate(t1, t1_0)
    call MPI_Finalize(mpi_ierr)

end program check_bounds


Thanks for your attention,

_______________________________________________
discuss mailing list     discuss at mpich.org
To manage subscription options or unsubscribe:
https://lists.mpich.org/mailman/listinfo/discuss
