[mpich-discuss] coupling

fereshteh komijani fereshtehkomijani at gmail.com
Sun Oct 27 04:06:42 CDT 2013


Dear Huiwei,
Thank you for your fast reply.
There is no examples subdirectory in the MPICH2 install folder
(mpich2-install_new), but the original source folder I downloaded
(mpich2-1.4.1p1) does contain one:

*Code:*
[fkomijani at localhost examples_graphics]$ cd /home/fkomijani/mpich2-1.4.1p1/examples

[fkomijani at localhost examples]$ mpirun -n 2 ./cpi

Process 0 of 2 is on localhost.localdomain

Process 1 of 2 is on localhost.localdomain

pi is approximately 3.1415926544231318, Error is 0.0000000008333387

wall clock time = 0.000232

i attach mpi2.log file
cheers
fereshte



On Sun, Oct 27, 2013 at 12:19 PM, Huiwei Lu <huiweilu at mcs.anl.gov> wrote:

> It seems to me the application has gone wrong and called MPI_Abort to exit.
> Have you tried to run some other MPI programs to see whether MPI works?
> e.g. cpi.c under mpi_src_path/examples/
>
>     mpirun -n 2 ./cpi
>
> Also, could you please send us the mpi.log you generated, to help us
> diagnose the error?
>
> And, if there is no specific reason to stay on 1.4.1p1, you are more than
> welcome to use the latest release of MPICH (currently MPICH 3.0.4 at
> http://www.mpich.org/), as it has new features and is better supported.
>
> --
> Huiwei Lu
> http://www.mcs.anl.gov/~huiweilu/
>
> On Oct 27, 2013, at 3:39 AM, fereshteh komijani <
> fereshtehkomijani at gmail.com> wrote:
>
> > Dear all,
> > I am new to this forum. I am coupling the ROMS and SWAN models, which
> > requires the MCT and MPICH libraries. I installed MCT with
> > Code:
> > ./configure CC=gcc FC=gfortran CPPFLAGS=-I/home/usr/mct/include LDFLAGS=-L/home/usr/mctcal/lib --prefix=/home/usr/mct
> > make install
> > and MPICH2 release 1.4.1p1 with
> > Code:
> > ./configure FC=gfortran CC=gcc --prefix=/home/<user>/mpich2_install_new 2>&1 | tee c.txt
> > make 2>&1 | tee m.txt
> > make install 2>&1 | tee mi.txt
> > But when running the coupled model, with 1 CPU set for the SWAN model
> > and 1 for the ROMS model (therefore 1 node for each of them), by typing
> > Code:
> > mpirun -np 2 ./oceanG coupling_inlet-test.in > mpi.log
> > it replies:
> > application called MPI_Abort(comm=0x84000002, 4) - process 0
> > Would you please tell me how I can solve this problem?
> > cheers
> > fereshte
> > **Angel**
> >
> >
> > _______________________________________________
> > discuss mailing list     discuss at mpich.org
> > To manage subscription options or unsubscribe:
> > https://lists.mpich.org/mailman/listinfo/discuss
>
>



-- 
***Angel***

-------------- next part --------------
A non-text attachment was scrubbed...
Name: mpi2.log
Type: application/octet-stream
Size: 463 bytes
Desc: not available
URL: <http://lists.mpich.org/pipermail/discuss/attachments/20131027/afc31d2b/attachment.obj>