[mpich-discuss] Assertion failed in file src/mpid/ch3/src/ch3u_handle_send_req.c at line 61 (RMA && Derived datatypes)
Balaji, Pavan
balaji at anl.gov
Mon Nov 10 07:53:12 CST 2014
Victor,
We believe this has been fixed in mpich/master. Please download the latest mpich nightly snapshot and give it a try to see if it fixes the issue for you.
FYI, the mpich website is down for maintenance at the moment. It should be back up in a couple of hours (or sooner).
— Pavan
> On Nov 10, 2014, at 6:57 AM, Victor Vysotskiy <victor.vysotskiy at teokem.lu.se> wrote:
>
> Dear Developers,
>
> I recently reported a problem with an assertion failure in file src/mpid/ch3/src/ch3u_handle_send_req.c:
>
>
> http://lists.mpich.org/pipermail/discuss/2014-October/003354.html
>
> I have finally been able to narrow the problem down to a small piece of code; please find the test-bed code enclosed. To reproduce the problem, compile and run it with the following arguments:
>
> mpicc mpi_tvec2_rma.c -o mpi_tvec2_rma
>
> mpirun -np 8 ./mpi_tvec2_rma 80 400000
>
> Assertion failed in file src/mpid/ch3/src/ch3u_handle_send_req.c at line 61: win_ptr->at_completion_counter >= 0
>
> internal ABORT - process 0
>
> ===================================================================================
>
> = BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
>
> = PID 12900 RUNNING AT n6
>
> = EXIT CODE: 1
>
> = CLEANING UP REMAINING PROCESSES
>
> = YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
>
>
> ===================================================================================
>
>
>
> However, everything works fine with the arguments "8 400000" and with "16 400000". The problem is reproducible both on my 4-core laptop and on a 16-core HP SL203S GEN8 compute node. GCC v4.7.3 and Intel v15.0.0 were used to compile MPICH v3.1.3 on the laptop and on the HP SL203S, respectively. Moreover, the MPICH v3.1.3 build includes commit '920661c3931' by Xin Zhao.
>
>
>
> Could you please comment on this issue? Is it a bug in MPICH, or is something wrong with the attached test-bed code?
>
>
>
> With best regards,
>
> Victor.
>
> <mpi_tvec2_rma.c>
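[The attached mpi_tvec2_rma.c is not reproduced in the archive. The following is a minimal sketch of the general pattern the report describes, not Victor's actual test: each rank accumulates into an RMA window through a strided MPI_Type_vector derived datatype inside a fence epoch. The datatype shape, the choice of MPI_Accumulate, and the meaning of the two command-line arguments (taken here as block count and block length) are assumptions.

/* Hypothetical sketch -- not the attached mpi_tvec2_rma.c.
 * It only illustrates the reported pattern: an accumulate through a
 * strided (vector) derived datatype into an RMA window inside a
 * fence epoch. Datatype shape and argument meaning are assumed. */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    int rank, nprocs, target, nblk, blklen, i;
    MPI_Aint nelem;
    MPI_Datatype tvec;
    MPI_Win win;
    double *base, *buf;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    if (argc < 3) {
        if (rank == 0) fprintf(stderr, "usage: %s nblk blklen\n", argv[0]);
        MPI_Abort(MPI_COMM_WORLD, 1);
    }
    nblk   = atoi(argv[1]);                /* e.g. 80     */
    blklen = atoi(argv[2]);                /* e.g. 400000 */
    nelem  = (MPI_Aint)nblk * 2 * blklen;  /* window size; stride = 2*blklen */

    /* Strided derived datatype: nblk blocks of blklen doubles. */
    MPI_Type_vector(nblk, blklen, 2 * blklen, MPI_DOUBLE, &tvec);
    MPI_Type_commit(&tvec);

    /* Window large enough to hold the full strided extent. */
    MPI_Win_allocate(nelem * (MPI_Aint)sizeof(double), sizeof(double),
                     MPI_INFO_NULL, MPI_COMM_WORLD, &base, &win);
    for (i = 0; i < nelem; i++) base[i] = 0.0;

    buf = (double *)malloc((size_t)nblk * blklen * sizeof(double));
    for (i = 0; i < nblk * blklen; i++) buf[i] = (double)rank;

    /* Fence epoch: each rank accumulates a contiguous local buffer
     * into the strided layout on the next rank. */
    target = (rank + 1) % nprocs;
    MPI_Win_fence(0, win);
    MPI_Accumulate(buf, nblk * blklen, MPI_DOUBLE,
                   target, 0, 1, tvec, MPI_SUM, win);
    MPI_Win_fence(0, win);

    free(buf);
    MPI_Type_free(&tvec);
    MPI_Win_free(&win);
    MPI_Finalize();
    return 0;
}

Built with mpicc and launched with mpirun as shown above, the two integers only size the derived datatype and the window; in the original report the failure depended on the particular argument combination.]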
--
Pavan Balaji
http://www.mcs.anl.gov/~balaji
_______________________________________________
discuss mailing list discuss at mpich.org
To manage subscription options or unsubscribe:
https://lists.mpich.org/mailman/listinfo/discuss