[mpich-discuss] Error with MPI_Spawn

Jeff Hammond jeff.science at gmail.com
Sun Jun 23 17:03:09 CDT 2013


This is the wrong way to use PETSc, and the wrong way to parallelize a
code with a parallel library in general.

Write to the PETSc users list and they will explain how to parallelize
your code properly with PETSc.
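
For context, the usual pattern is to link PETSc into the application
itself, launch the whole program under mpiexec, and call the KSP solver
directly on PETSC_COMM_WORLD, rather than spawning a separate solver
executable. A rough, untested sketch follows (error checking omitted;
it assumes PETSc >= 3.5, where KSPSetOperators takes three arguments,
while earlier releases add a MatStructure flag):

    /* Minimal sketch of the conventional approach: link PETSc into the
       application, run the whole program under mpiexec, and call KSP
       directly. Untested; error checking omitted. */
    #include <petscksp.h>

    int main(int argc, char **argv)
    {
        Mat A;
        Vec x, b;
        KSP ksp;
        PetscInt i, n = 100, Istart, Iend;

        PetscInitialize(&argc, &argv, NULL, NULL);  /* calls MPI_Init itself */

        /* Assemble a toy tridiagonal system; each rank owns a block of rows. */
        MatCreate(PETSC_COMM_WORLD, &A);
        MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);
        MatSetFromOptions(A);
        MatSetUp(A);
        MatGetOwnershipRange(A, &Istart, &Iend);
        for (i = Istart; i < Iend; i++) {
            if (i > 0)     MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES);
            if (i < n - 1) MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES);
            MatSetValue(A, i, i, 2.0, INSERT_VALUES);
        }
        MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
        MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

        /* Right-hand side and solution vectors with a matching layout. */
        VecCreate(PETSC_COMM_WORLD, &b);
        VecSetSizes(b, PETSC_DECIDE, n);
        VecSetFromOptions(b);
        VecDuplicate(b, &x);
        VecSet(b, 1.0);

        /* Solve A x = b; method and preconditioner are selected at run
           time with -ksp_type / -pc_type. */
        KSPCreate(PETSC_COMM_WORLD, &ksp);
        KSPSetOperators(ksp, A, A);
        KSPSetFromOptions(ksp);
        KSPSolve(ksp, b, x);

        KSPDestroy(&ksp);
        VecDestroy(&x);
        VecDestroy(&b);
        MatDestroy(&A);
        PetscFinalize();
        return 0;
    }

A program like this is built against PETSc and run as, for example,
"mpiexec -n 4 ./app", with no spawning involved.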

Jeff

Sent from my iPhone

On Jun 23, 2013, at 4:59 PM, Nitel Muhtaroglu <muhtaroglu.n at gmail.com> wrote:

> Hello,
>
> I am trying to integrate the PETSc library into a serial program. The idea is that the serial program assembles a linear system of equations and then launches the PETSc solver via MPI_Comm_spawn, which solves the system in parallel (a sketch of this spawn pattern appears below, after the quoted message). But when the MPI_Comm_spawn call executes, the following error message appears and the solver is never reached. I couldn't find a solution to this error. Does anyone have an idea what causes it?
>
> Kind Regards,
> --
> Nitel
>
> **********************************************************
> Assertion failed in file socksm.c at line 590: hdr.pkt_type == MPIDI_NEM_TCP_SOCKSM_PKT_ID_INFO || hdr.pkt_type == MPIDI_NEM_TCP_SOCKSM_PKT_TMPVC_INFO
> internal ABORT - process 0
> INTERNAL ERROR: Invalid error class (66) encountered while returning from
> MPI_Init.  Please file a bug report.
> Fatal error in MPI_Init: Unknown error.  Please file a bug report., error stack:
> (unknown)(): connection failure
> [cli_0]: aborting job:
> Fatal error in MPI_Init: Unknown error.  Please file a bug report., error stack:
> (unknown)(): connection failure
> **********************************************************
> _______________________________________________
> discuss mailing list     discuss at mpich.org
> To manage subscription options or unsubscribe:
> https://lists.mpich.org/mailman/listinfo/discuss
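
For reference, the spawn pattern described in the quoted message
presumably looks something like the sketch below; the executable name,
process count, and communication details are illustrative and not taken
from the original post. With MPICH the parent normally has to be
launched under mpiexec so the process manager can service the spawn,
and the log above shows the failure coming from MPI_Init ("connection
failure"), apparently while the spawned processes try to connect back
to the parent.

    /* Hypothetical sketch of the spawn pattern described above: a serial
       parent launches a parallel solver executable via MPI_Comm_spawn.
       The name "./petsc_solver" and the process count are made up.
       Error checking is omitted. */
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        MPI_Comm solver;   /* intercommunicator to the spawned solver */

        MPI_Init(&argc, &argv);

        /* Spawn 4 solver processes. The children call MPI_Init (or
           PetscInitialize) themselves and obtain this intercommunicator
           with MPI_Comm_get_parent(). */
        MPI_Comm_spawn("./petsc_solver", MPI_ARGV_NULL, 4, MPI_INFO_NULL,
                       0, MPI_COMM_SELF, &solver, MPI_ERRCODES_IGNORE);

        /* ... send the assembled system to the solver processes and
           receive the solution back over the intercommunicator ... */

        MPI_Comm_disconnect(&solver);
        MPI_Finalize();
        return 0;
    }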


