[mpich-discuss] unable to bind socket to port

Balaji, Pavan balaji at anl.gov
Fri May 16 19:48:40 CDT 2014


This doesn't look like an error caused by multiple initializations.  If you do that, MPICH will throw an error saying "Cannot call MPI_INIT or MPI_INIT_THREAD more than once".
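
As an aside, a library that cannot know whether the host application has already initialized MPI can guard its setup with MPI_Initialized, which may be called before MPI_Init.  A minimal sketch (error checking omitted; I believe SlepcInitialize does a similar check internally):

    #include <mpi.h>

    /* Only call MPI_Init if the host application has not done so
       already.  MPI_Initialized is one of the few MPI routines that
       may be called before MPI_Init, so this check is always safe. */
    static void ensure_mpi_initialized(int *argc, char ***argv)
    {
        int initialized = 0;
        MPI_Initialized(&initialized);
        if (!initialized)
            MPI_Init(argc, argv);
    }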

It looks like you might not be using SLEPc correctly.  Perhaps ask for help on their mailing list?

  -- Pavan

On May 16, 2014, at 6:56 PM, 张国熙 <altriaex86 at gmail.com> wrote:

> Hi, all
> 
> I'm using SLEPc as an eigensolver; it uses MPICH for multiprocess execution.
> 
> I compile the SLEPc code into a .so library, B.
> 
> My executable program A calls the .so library B, and library B calls the SLEPc library.
> At first I put MPI_Init() only in A, but the program got stuck.  Then I tried putting it in A and B as well (and in C: the SLEPc init calls this function itself).
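> 
> Roughly, the structure looks like this (a simplified sketch; call_library_B is a placeholder name for my real entry point into B):
> 
>     /* A: the executable */
>     #include <mpi.h>
> 
>     void call_library_B(int argc, char **argv);  /* implemented in B;
>                                                     eventually calls
>                                                     SlepcInitialize */
> 
>     int main(int argc, char **argv)
>     {
>         MPI_Init(&argc, &argv);      /* first attempt: init only in A */
>         call_library_B(argc, argv);
>         MPI_Finalize();
>         return 0;
>     }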
> 
> INTERNAL ERROR: Invalid error class (59) encountered while returning from
> MPI_Init.  Please file a bug report.
> Fatal error in MPI_Init: Unknown error.  Please file a bug report., error stack:
> (unknown)(): unable to bind socket to port
> INTERNAL ERROR: Invalid error class (59) encountered while returning from
> MPI_Init.  Please file a bug report.
> Fatal error in MPI_Init: Unknown error.  Please file a bug report., error stack:
> (unknown)(): unable to bind socket to port
> [cli_1]: aborting job:
> Fatal error in MPI_Init: Unknown error.  Please file a bug report., error stack:
> (unknown)(): unable to bind socket to port
> INTERNAL ERROR: Invalid error class (59) encountered while returning from
> MPI_Init.  Please file a bug report.
> INTERNAL ERROR: Invalid error class (59) encountered while returning from
> MPI_Init.  Please file a bug report.
> Fatal error in MPI_Init: Unknown error.  Please file a bug report., error stack:
> (unknown)(): unable to bind socket to port
> [cli_3]: aborting job:
> Fatal error in MPI_Init: Unknown error.  Please file a bug report., error stack:
> (unknown)(): unable to bind socket to port
> [cli_0]: aborting job:
> Fatal error in MPI_Init: Unknown error.  Please file a bug report., error stack:
> (unknown)(): unable to bind socket to port
> Fatal error in MPI_Init: Unknown error.  Please file a bug report., error stack:
> (unknown)(): unable to bind socket to port
> [cli_2]: aborting job:
> Fatal error in MPI_Init: Unknown error.  Please file a bug report., error stack:
> (unknown)(): unable to bind socket to port
> 
> ===================================================================================
> =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
> =   EXIT CODE: 1
> =   CLEANING UP REMAINING PROCESSES
> =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
> ===================================================================================
> root at altria-Aspire-5830TG:/home/altria/software/rememode-1.1alpha/Examples/c++# /home/altria/software/petsc-3.4.4/arch-linux2-c-debug/bin/mpiexec -np 4 ./fd_guide
> INTERNAL ERROR: Invalid error class (59) encountered while returning from
> MPI_Init.  Please file a bug report.
> INTERNAL ERROR: Invalid error class (59) encountered while returning from
> MPI_Init.  Please file a bug report.
> Fatal error in MPI_Init: Unknown error.  Please file a bug report., error stack:
> (unknown)(): unable to bind socket to port
> [cli_1]: aborting job:
> Fatal error in MPI_Init: Unknown error.  Please file a bug report., error stack:
> (unknown)(): unable to bind socket to port
> INTERNAL ERROR: Invalid error class (59) encountered while returning from
> MPI_Init.  Please file a bug report.
> Fatal error in MPI_Init: Unknown error.  Please file a bug report., error stack:
> (unknown)(): unable to bind socket to port
> [cli_2]: aborting job:
> Fatal error in MPI_Init: Unknown error.  Please file a bug report., error stack:
> (unknown)(): unable to bind socket to port
> 
> ===================================================================================
> =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
> =   EXIT CODE: 1
> =   CLEANING UP REMAINING PROCESSES
> =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
> ===================================================================================
> 
> I think it is because I called MPI_Init more than once.  If I call it only in A and C (the SLEPc call), the code in A and B executes but gets stuck at C.  If I call it in A, B, and C, the code in A executes but stops at B.
> 
> I would expect that even if I call MPI_Init more than once, it would return something like "MPI_Init has been called".  Does anyone know how to fix this?  Thanks.
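> 
> To see what my MPICH build actually does on a second MPI_Init, I can run a tiny two-call test like this (a minimal sketch; compile with mpicc, run under mpiexec):
> 
>     #include <mpi.h>
> 
>     int main(int argc, char **argv)
>     {
>         MPI_Init(&argc, &argv);
>         MPI_Init(&argc, &argv);  /* second call: expected to abort with
>                                     "Cannot call MPI_INIT or
>                                     MPI_INIT_THREAD more than once" */
>         MPI_Finalize();
>         return 0;
>     }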
> 
> Yours sincerely,
> Guoxi
> _______________________________________________
> discuss mailing list     discuss at mpich.org
> To manage subscription options or unsubscribe:
> https://lists.mpich.org/mailman/listinfo/discuss


