[mpich-discuss] cpi

יוסף אלון yos104104 at gmail.com
Wed Mar 23 11:13:17 CDT 2016


I tried to cd into the examples directory and run the command, but I got
"Permission denied", so I used sudo, and then it said "mpicc: command not found".

cluster at elec-cluster-1 ~/mpich2/mpich-3.1/examples $ mpicc -o cpi cpi.c
/usr/bin/ld: cannot open output file cpi: Permission denied
collect2: ld returned 1 exit status
cluster at elec-cluster-1 ~/mpich2/mpich-3.1/examples $ sudo mpicc -o cpi cpi.c
[sudo] password for cluster:
sudo: mpicc: command not found
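
A minimal sketch of one way around both errors, without sudo (assuming the MPICH
bin directory is on the regular user's PATH and the home directory is writable;
the machinefile path below is only a placeholder):

    cd ~/mpich2/mpich-3.1/examples
    # write the executable somewhere the "cluster" user can write to
    mpicc -o ~/cpi cpi.c
    # launch the compiled binary (not the .c source)
    mpiexec -n 5 -f ~/machinefile ~/cpi

Running the compiler under sudo fails here because root's PATH usually does not
include the MPICH bin directory. Without a shared filesystem, the binary also
has to exist at the same path on every node before mpiexec can launch it there.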

2016-03-23 18:09 GMT+02:00 sanjesh pant <spant3474 at gmail.com>:

> First compile it using the command:
> mpicc -o cpi cpi.c
> On 23 Mar 2016 21:33, "יוסף אלון" <yos104104 at gmail.com> wrote:
>
>> But I do want to run a C source code across all the nodes.
>> Also, when I run just cpi, without the .c, nothing happens.
>> I think you are the only one who can help me.
>> Thank you.
>>
>> 2016-03-23 17:59 GMT+02:00 Kenneth Raffenetti <raffenet at mcs.anl.gov>:
>>
>>> The executable MPICH builds (and you want to run) is "cpi" in the
>>> examples. cpi.c is a C source code file and is not executable. I suggest
>>> you read more on how to build and execute C programs if you still have
>>> questions.
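>>>
>>> As a general pattern, a sketch (assuming mpicc and mpiexec are on the PATH and
>>> the current directory is writable; the process count is arbitrary):
>>>
>>>     mpicc -o cpi cpi.c    # compile the C source file into an executable named cpi
>>>     mpiexec -n 4 ./cpi    # launch the compiled executable under MPI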
>>>
>>> Ken
>>>
>>> On 03/23/2016 10:43 AM, יוסף אלון wrote:
>>>
>>>> what do you mean?
>>>>
>>>>
>>>> 2016-03-23 17:42 GMT+02:00 Kenneth Raffenetti <raffenet at mcs.anl.gov>:
>>>>
>>>>
>>>>     cpi.c is a source file, not an executable.
>>>>
>>>>     On 03/23/2016 10:41 AM, יוסף אלון wrote:
>>>>
>>>>         When I do this, I receive:
>>>>
>>>>         cluster at elec-cluster-1 ~ $ mpiexec -n 5 -f machinefile ./mpich2/mpich-3.1/examples/cpi.c
>>>>         [proxy:0:0 at elec-cluster-1] HYDU_create_process
>>>>
>>>> (/home/cluster/mpich2/mpich-3.1/src/pm/hydra/utils/launch/launch.c:75):
>>>>         execvp error on file ./mpich2/mpich-3.1/examples/cpi.c
>>>>         (Permission denied)
>>>>
>>>>
>>>> ===================================================================================
>>>>         =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
>>>>         =   PID 2895 RUNNING AT 147.161.4.200
>>>>         =   EXIT CODE: 255
>>>>         =   CLEANING UP REMAINING PROCESSES
>>>>         =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
>>>>
>>>> ===================================================================================
>>>>         [proxy:0:1 at elec-cluster-2] HYDU_create_process
>>>>
>>>> (/home/cluster/mpich2/mpich-3.1/src/pm/hydra/utils/launch/launch.c:75):
>>>>         execvp error on file ./mpich2/mpich-3.1/examples/cpi.c
>>>>         (Permission denied)
>>>>         [proxy:0:4 at elec-cluster-5] HYDU_create_process
>>>>
>>>> (/home/cluster/mpich2/mpich-3.1/src/pm/hydra/utils/launch/launch.c:75):
>>>>
>>>> ===================================================================================
>>>>         =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
>>>>         =   PID 7382 RUNNING AT 147.161.4.201
>>>>         =   EXIT CODE: 255
>>>>         =   CLEANING UP REMAINING PROCESSES
>>>>         =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
>>>>
>>>> ===================================================================================
>>>>
>>>>         2016-03-23 17:34 GMT+02:00 Kenneth Raffenetti <raffenet at mcs.anl.gov>:
>>>>
>>>>
>>>>              The error in your first message was that mpiexec was unable
>>>>         to find
>>>>              the file "./examples/cpi".
>>>>
>>>>              The error in your second message is that mpiexec was unable
>>>>         to find
>>>>              your "machinefile".
>>>>
>>>>              Please make sure you are giving the correct paths to these
>>>>         files in
>>>>              your mpiexec command.
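>>>>
>>>>              For example, a sketch (the hostnames and paths below are
>>>>              placeholders): a Hydra machinefile is a plain text file with
>>>>              one host per line, optionally followed by :N for the number
>>>>              of processes to place on that host:
>>>>
>>>>                  elec-cluster-1:1
>>>>                  elec-cluster-2:1
>>>>
>>>>              Passing absolute paths avoids any ambiguity about the
>>>>              working directory:
>>>>
>>>>                  mpiexec -n 5 -f /home/cluster/machinefile /home/cluster/mpich2/mpich-3.1/examples/cpi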
>>>>
>>>>              Ken
>>>>
>>>>              On 03/23/2016 10:20 AM, יוסף אלון wrote:
>>>>
>>>>                  I am not using a shared network filesystem, but I have it
>>>>                  in the same place on all nodes.
>>>>                  The files are located in the same place on every node.
>>>>                  I also tried to run it like this:
>>>>                  [inline image 1]
>>>>
>>>>                  2016-03-23 17:06 GMT+02:00 Kenneth Raffenetti <raffenet at mcs.anl.gov>:
>>>>
>>>>
>>>>
>>>>                       Are you executing your commands from a shared network
>>>>                       filesystem? If not, have you copied your MPI
>>>>                       installation and cpi binaries into the same location
>>>>                       on all the machines in your cluster?
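>>>>
>>>>                       Without a shared filesystem, one option is to copy
>>>>                       the compiled binary to the same path on every node,
>>>>                       e.g. a sketch (node names, range, and path are
>>>>                       placeholders; passwordless ssh is assumed):
>>>>
>>>>                           for i in $(seq 2 18); do
>>>>                               # copy cpi into the same examples directory on each node
>>>>                               scp ~/mpich2/mpich-3.1/examples/cpi elec-cluster-$i:mpich2/mpich-3.1/examples/
>>>>                           done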
>>>>
>>>>                       Ken
>>>>
>>>>                       On 03/23/2016 09:47 AM, יוסף אלון wrote:
>>>>
>>>>
>>>>                           Hi,
>>>>
>>>>                           I am new here, and I have an 18-node cluster that
>>>>                           works pretty well when I execute the command:
>>>>
>>>>                           mpiexec -f machinefile -n 18 hostname
>>>>
>>>>                           and I get the following output:
>>>>                           elec-cluster-1
>>>>                           elec-cluster-2
>>>>                           elec-cluster-3
>>>>                           elec-cluster-5
>>>>                           elec-cluster-4
>>>>                           elec-cluster-6
>>>>                           elec-cluster-7
>>>>                           elec-cluster-9
>>>>                           elec-cluster-8
>>>>                           elec-cluster-10
>>>>                           elec-cluster-11
>>>>                           elec-cluster-13
>>>>                           elec-cluster-14
>>>>                           elec-cluster-15
>>>>                           elec-cluster-16
>>>>                           elec-cluster-12
>>>>                           elec-cluster-18
>>>>                           elec-cluster-17
>>>>
>>>>                           When I execute the command:
>>>>
>>>>                           mpiexec -n 5 -f machinefile ./examples/cpi
>>>>
>>>>                           nothing seems to work, and I receive:
>>>>
>>>>                           [proxy:0:0 at elec-cluster-1]
>>>> HYDU_create_process
>>>>
>>>>
>>>>
>>>> (/home/cluster/mpich2/mpich-3.1/src/pm/hydra/utils/launch/launch.c:75):
>>>>                           execvp error on file ./examples/cpi (No such
>>>>         file or
>>>>                  directory)
>>>>
>>>>
>>>>
>>>>
>>>> ===================================================================================
>>>>                           =   BAD TERMINATION OF ONE OF YOUR APPLICATION
>>>>         PROCESSES
>>>>                           =   PID 2806 RUNNING AT 147.161.4.200
>>>>                           =   EXIT CODE: 255
>>>>                           =   CLEANING UP REMAINING PROCESSES
>>>>                           =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
>>>>
>>>>
>>>>
>>>> ===================================================================================
>>>>                           [proxy:0:2 at elec-cluster-3]
>>>> HYDU_create_process
>>>>
>>>>
>>>>
>>>> (/home/cluster/mpich2/mpich-3.1/src/pm/hydra/utils/launch/launch.c:75):
>>>>                           execvp error on file ./examples/cpi (No such
>>>>         file or
>>>>                  directory)
>>>>                           [proxy:0:3 at elec-cluster-4]
>>>> HYDU_create_process
>>>>
>>>>
>>>>
>>>> (/home/cluster/mpich2/mpich-3.1/src/pm/hydra/utils/launch/launch.c:75):
>>>>
>>>>
>>>>
>>>> ===================================================================================
>>>>                           =   BAD TERMINATION OF ONE OF YOUR APPLICATION
>>>>         PROCESSES
>>>>                           =   PID 6718 RUNNING AT 147.161.4.202
>>>>                           =   EXIT CODE: 255
>>>>                           =   CLEANING UP REMAINING PROCESSES
>>>>                           =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
>>>>
>>>>
>>>>
>>>> ===================================================================================
>>>>
>>>>                           I don't know what to do.
>>>>                           Another thing: how do I compile and run a C
>>>>                           program?
>>>>
>>>>                           --
>>>>                           Regards, יוסף אלון
>>>>                           050-4916740
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>                  --
>>>>                  Regards, יוסף אלון
>>>>                  050-4916740
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>         --
>>>>         Regards, יוסף אלון
>>>>         050-4916740
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Regards, יוסף אלון
>>>> 050-4916740
>>>>
>>>>
>>>>
>>>
>>
>>
>>
>> --
>> Regards, יוסף אלון
>> 050-4916740
>>
>>
>
>



-- 
Regards, יוסף אלון
050-4916740
_______________________________________________
discuss mailing list     discuss at mpich.org
To manage subscription options or unsubscribe:
https://lists.mpich.org/mailman/listinfo/discuss

