[mpich-discuss] unable to bind socket to port
张国熙
altriaex86@gmail.com
Fri May 16 18:56:10 CDT 2014
Hi all,
I'm using SLEPc as an eigensolver; it relies on MPICH for multiprocess execution.
I compile my SLEPc-based code into a shared library B. My executable program A calls the .so library B, and library B calls the SLEPc library (I'll call that layer C).
At first I put MPI_Init() only in A, but the program got stuck. Then I tried calling it in A and B as well (and effectively in C too, since SLEPc's initialization calls MPI_Init).
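To make the layering concrete, here is a rough sketch of what A and B look like (init_B and finalize_B are placeholder names for my real entry points, and the calls are written from memory, not copied from my code):

/* B.c -- built into the shared library libB.so, links against SLEPc */
#include <slepceps.h>

int init_B(int *argc, char ***argv)
{
    /* As I understand it, SlepcInitialize() sets up PETSc, and PETSc only
       calls MPI_Init() itself if MPI has not been initialized already. */
    return SlepcInitialize(argc, argv, NULL, NULL);
}

int finalize_B(void)
{
    return SlepcFinalize();
}

/* A.c -- the executable, links against libB.so */
#include <mpi.h>

int init_B(int *argc, char ***argv);
int finalize_B(void);

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);   /* originally the only place I called it */
    init_B(&argc, &argv);     /* B hands off to SLEPc */
    /* ... build and solve the eigenproblem through B ... */
    finalize_B();             /* B calls SlepcFinalize() */
    MPI_Finalize();
    return 0;
}

After I added MPI_Init() to B as well (my second attempt), I get the output below: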
INTERNAL ERROR: Invalid error class (59) encountered while returning from MPI_Init. Please file a bug report.
Fatal error in MPI_Init: Unknown error. Please file a bug report., error stack:
(unknown)(): unable to bind socket to port
INTERNAL ERROR: Invalid error class (59) encountered while returning from MPI_Init. Please file a bug report.
Fatal error in MPI_Init: Unknown error. Please file a bug report., error stack:
(unknown)(): unable to bind socket to port
[cli_1]: aborting job:
Fatal error in MPI_Init: Unknown error. Please file a bug report., error stack:
(unknown)(): unable to bind socket to port
INTERNAL ERROR: Invalid error class (59) encountered while returning from MPI_Init. Please file a bug report.
INTERNAL ERROR: Invalid error class (59) encountered while returning from MPI_Init. Please file a bug report.
Fatal error in MPI_Init: Unknown error. Please file a bug report., error stack:
(unknown)(): unable to bind socket to port
[cli_3]: aborting job:
Fatal error in MPI_Init: Unknown error. Please file a bug report., error stack:
(unknown)(): unable to bind socket to port
[cli_0]: aborting job:
Fatal error in MPI_Init: Unknown error. Please file a bug report., error stack:
(unknown)(): unable to bind socket to port
Fatal error in MPI_Init: Unknown error. Please file a bug report., error stack:
(unknown)(): unable to bind socket to port
[cli_2]: aborting job:
Fatal error in MPI_Init: Unknown error. Please file a bug report., error stack:
(unknown)(): unable to bind socket to port
===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= EXIT CODE: 1
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
root@altria-Aspire-5830TG:/home/altria/software/rememode-1.1alpha/Examples/c++# /home/altria/software/petsc-3.4.4/arch-linux2-c-debug/bin/mpiexec -np 4 ./fd_guide
INTERNAL ERROR: Invalid error class (59) encountered while returning from MPI_Init. Please file a bug report.
INTERNAL ERROR: Invalid error class (59) encountered while returning from MPI_Init. Please file a bug report.
Fatal error in MPI_Init: Unknown error. Please file a bug report., error stack:
(unknown)(): unable to bind socket to port
[cli_1]: aborting job:
Fatal error in MPI_Init: Unknown error. Please file a bug report., error stack:
(unknown)(): unable to bind socket to port
INTERNAL ERROR: Invalid error class (59) encountered while returning from MPI_Init. Please file a bug report.
Fatal error in MPI_Init: Unknown error. Please file a bug report., error stack:
(unknown)(): unable to bind socket to port
[cli_2]: aborting job:
Fatal error in MPI_Init: Unknown error. Please file a bug report., error stack:
(unknown)(): unable to bind socket to port
===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= EXIT CODE: 1
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
I think this happens because I call MPI_Init more than once: if I only call it in A and C (the SLEPc call), the code in A and B runs but gets stuck at C; if I call it in A, B, and C, the code in A runs but stops at B.
Even so, I would expect a second call to MPI_Init to just report something like "MPI_Init has already been called" instead of failing this way. Does anyone know how to fix it? Thanks.
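For reference, the kind of guard I had in mind is roughly this (a minimal sketch using only the standard MPI_Initialized/MPI_Init calls; ensure_mpi is just a name I made up for illustration):

#include <mpi.h>
#include <stdio.h>

/* Initialize MPI at most once; report instead of calling MPI_Init a second time. */
static void ensure_mpi(int *argc, char ***argv)
{
    int already = 0;
    MPI_Initialized(&already);   /* legal to call before MPI_Init */
    if (already) {
        fprintf(stderr, "MPI_Init has already been called; skipping.\n");
        return;
    }
    MPI_Init(argc, argv);
}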
Yours sincerely,
Guoxi