[mpich-discuss] code from different version of LINUX

Shuchi Yang yang.shuchi at gmail.com
Wed Nov 26 18:43:01 CST 2014


Yes, I can get the right results, but it does not always happen. Sometimes it
goes wrong, sometimes it is OK. It is really weird.

On Wed, Nov 26, 2014 at 2:26 PM, Junchao Zhang <jczhang at mcs.anl.gov> wrote:

> Do you run your program on one node with 20 processes, and is the result
> correct except that the run is slower?
> If yes, then it seems only one CPU of the node is used, which is weird.
>
>
>
>
> --Junchao Zhang
>
> On Wed, Nov 26, 2014 at 1:22 PM, Shuchi Yang <yang.shuchi at gmail.com>
> wrote:
>
>> I am doing a CFD simulation.
>> With MPI, I can split the computational domain into parts so that each
>> process works on a different part. In this way the total computational
>> time is reduced.
>>
>> When I tried to run on another system, it looks like every CPU is working
>> on the whole computational domain, so the computational efficiency is
>> very low.
>>
>>
>>
>> On Wed, Nov 26, 2014 at 11:45 AM, Junchao Zhang <jczhang at mcs.anl.gov>
>> wrote:
>>
>>> I guess it is not an MPI problem. When you say "every CPU works on all
>>> the data", you need a clear idea of what the data decomposition in your
>>> code is.
>>>
>>>
>>> --Junchao Zhang
>>>
>>> On Wed, Nov 26, 2014 at 11:28 AM, Shuchi Yang <yang.shuchi at gmail.com>
>>> wrote:
>>>
>>>> Thanks for your reply. I am trying it the way you mentioned.
>>>> But I ran into one problem: on my original machine, I can run the code
>>>> with 20 CPUs so that each CPU works on part of the job. But on the new
>>>> machine, it starts 20 processes, yet every CPU works on all the data, so
>>>> it looks like the job is running 20 times simultaneously. Is that caused
>>>> by an MPI problem?
>>>> Thanks,
>>>>
>>>> Shuchi
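[A common cause of this symptom, not confirmed in the thread, is launching the binary with an mpiexec from a different MPI implementation than the one it was linked against; each process then sees an MPI_COMM_WORLD of size 1 and runs the whole domain by itself. A quick check, sketched in shell; `./solver` is a placeholder for the actual CFD binary:]

```shell
# Which launcher is first on PATH, and which MPI is it?
which mpiexec
mpiexec --version        # MPICH's Hydra and Open MPI's mpiexec report differently

# Which MPI runtime is the binary actually linked against?
ldd ./solver | grep mpi  # MPICH binaries typically link libmpifort.so.12

# If the binary links MPICH's libraries but mpiexec belongs to another MPI,
# every rank reports size 1 and duplicates the whole job.
```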
>>>>
>>>> On Wed, Nov 26, 2014 at 9:50 AM, Junchao Zhang <jczhang at mcs.anl.gov>
>>>> wrote:
>>>>
>>>>> You can copy the MPI libraries installed on the Ubuntu machine to the
>>>>> SUSE machine, then add that path to LD_LIBRARY_PATH on the SUSE machine.
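[The suggestion above could look like this in shell; the host name and library paths below are assumptions, not the poster's actual layout:]

```shell
# On the SUSE machine: copy the MPICH runtime libraries from the Ubuntu box
# into a local directory (host and source path are hypothetical).
mkdir -p "$HOME/mpich-libs"
scp ubuntu-host:/usr/lib/mpich/lib/libmpi*.so* "$HOME/mpich-libs/"

# Put that directory on the dynamic linker's search path before running.
export LD_LIBRARY_PATH="$HOME/mpich-libs:$LD_LIBRARY_PATH"
ldd ./solver | grep mpifort   # libmpifort.so.12 should now resolve
```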
>>>>>
>>>>> --Junchao Zhang
>>>>>
>>>>> On Wed, Nov 26, 2014 at 9:55 AM, Shuchi Yang <yang.shuchi at gmail.com>
>>>>> wrote:
>>>>>
>>>>>> I ran into a problem.
>>>>>> I compile a Fortran code on Ubuntu, but I need to run it on SUSE
>>>>>> Linux, and I am always told
>>>>>> *    error while loading shared libraries: libmpifort.so.12: cannot
>>>>>> open shared object file: No such file or directory*
>>>>>>
>>>>>> Furthermore, will it be a problem if I compile the code with
>>>>>> mpich-gcc and run it on another type of Linux?
>>>>>>
>>>>>> Thanks,
>>>>>>
>>>>>> Shuchi
>>>>>>
>>>>>> _______________________________________________
>>>>>> discuss mailing list     discuss at mpich.org
>>>>>> To manage subscription options or unsubscribe:
>>>>>> https://lists.mpich.org/mailman/listinfo/discuss
>>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>
>>>
>>
>>
>
>

