[mpich-discuss] mpi client/server : problem with mpich3
Kenneth Raffenetti
raffenet at mcs.anl.gov
Mon Sep 21 11:25:21 CDT 2015
If you build MPICH 3.1.4 with the option "--with-namepublisher=file", it
will use the same method as previous versions.
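A minimal sketch of such a build, assuming a fresh MPICH 3.1.4 source tree
(the --prefix path is only a placeholder):

# the install prefix below is only a placeholder
./configure --prefix=/opt/mpich-3.1.4-file --with-namepublisher=file
make
make install

With the file-based name publisher, publish/lookup behaves as it did in
previous versions, so your original two-line mpirun script, and several
concurrent coupled runs on the same host, should work again without
starting a hydra_nameserver.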
Ken
On 09/21/2015 09:36 AM, PEDRONO Annaig wrote:
> Hello,
>
> I am using the MPI Client/Server features to couple two codes.
> You can find a short example in the attached file.
> In the Makefile, mpif90 is the wrapper for mpich compiled with gfortran.
>
> With mpich2, I was able to run a client/server case without any problem
> using the following script:
>
> mpirun -np 1 ./Serveur > Serveur.log &
> sleep 5
> mpirun -np 1 ./Client > gradym_client.log
>
>
> and I am able to run many coupled simulations on the same computer.
>
>
> With mpich3, the script above doesn't work anymore. I found another
> way to run my coupled case in the mpich3 manual:
>
> #!/bin/csh
> # kill any hydra_nameserver left over from a previous run
> set hydraPID=`ps -u $USER | grep hydra_nameserve | cut -d ' ' -f 1`
> kill -9 $hydraPID
>
> # start a fresh nameserver in the background
> /PRODCOM/Ubuntu12.04/hydra/3.1.4/intel-15.0/bin/hydra_nameserver &
>
>
> # launch the server, then the client, against that nameserver
> mpiexec -hosts `hostname` -n 1 -nameserver `hostname` ./Serveur &
>
> sleep 1
>
> mpiexec -hosts `hostname` -n 1 -nameserver `hostname` ./Client
>
> # stop the nameserver once the coupled run has finished
> set hydraPID=`ps -u $USER | grep hydra_nameserve | cut -d ' ' -f 1`
> echo $hydraPID
> kill -9 $hydraPID
>
>
> It is more complicated, and I can run only one simulation per computer
> at a time because I can launch only one hydra nameserver.
> Do you know another way to run client/server simulations with mpich3?
> What can I do to run many client/server simulations on the same host?
>
> Thanks a lot for your help.
>
> A. Pedrono
>
_______________________________________________
discuss mailing list discuss at mpich.org
To manage subscription options or unsubscribe:
https://lists.mpich.org/mailman/listinfo/discuss