[mpich-discuss] credentials for mpiexec -f machinefile

"Antonio J. Peña" apenya at mcs.anl.gov
Wed Apr 23 16:10:25 CDT 2014


No, what I meant was doing "ssh localhost". But now that I look at your logs 
again, it may be a problem with your DNS configuration.

mpi-966f395e-bbb1-4a20-8cbc-c10081e91244 seems to be unable to resolve the IP address of mpi-be6bebee-55e3-4901-a5bb-637395ba46f6. Take a look at the /etc/hosts files on both hosts.
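
For example, assuming 198.125.163.212 is mpi-be6bebee-55e3-4901-a5bb-637395ba46f6 and 198.125.163.207 is mpi-966f395e-bbb1-4a20-8cbc-c10081e91244 (that is my reading of your output below; adjust if the mapping is different), entries along these lines in /etc/hosts on both machines should let each node resolve the other's name:

198.125.163.212    mpi-be6bebee-55e3-4901-a5bb-637395ba46f6
198.125.163.207    mpi-966f395e-bbb1-4a20-8cbc-c10081e91244

You can then check the resolution from the .207 node with something like:

$ getent hosts mpi-be6bebee-55e3-4901-a5bb-637395ba46f6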



On 04/23/2014 03:57 PM, Jan Balewski wrote:
> yes, from  IP=212 I can do:
> $ ssh cosy11 at 198.125.163.207
> Last login: Wed Apr 23 16:41:08 2014 from oswrk212.lns.mit.edu
>
> Jan
>
>
> On Apr 23, 2014, at 4:54 PM, "Antonio J. Peña" <apenya at mcs.anl.gov> wrote:
>
>> You need to be able to ssh to the local host password-less as well. Is that working for you?
>>
>>
>> On 04/23/2014 03:50 PM, Jan Balewski wrote:
>>> On Apr 23, 2014, at 4:41 PM, "Antonio J. Peña" <apenya at mcs.anl.gov> wrote:
>>>
>>>
>>>> We usually encourage users to set up password-less ssh access by means of private/public key authentication. There are plenty of web pages with tutorials for this setup; this is just a random one: http://stackoverflow.com/questions/7260/how-do-i-setup-public-key-authentication
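>>>>
>>>> A minimal sketch of that setup, run on the host you launch mpiexec from (here "otherhost" is just a placeholder for the remote machine in your machinefile; substitute the actual user and host):
>>>>
>>>> $ ssh-keygen -t rsa          # accept the defaults; leave the passphrase empty
>>>> $ ssh-copy-id otherhost      # append your public key to ~/.ssh/authorized_keys on the remote host
>>>> $ ssh-copy-id localhost      # password-less access to the local host is needed as well
>>>> $ ssh otherhost              # should now log in without a password prompt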
>>>>
>>>>
>>>>    Antonio
>>>>
>>> Thanks Antonio.
>>>   I just did that.
>>>
>>> Now I'm able to  use this setup:
>>> $ cat machinefile
>>> 198.125.163.212:2
>>>
>>> to run on my login VM with IP=212:
>>> $ mpiexec -f machinefile -n 2 ./examples/cpi
>>> Process 0 of 2 is on mpi-be6bebee-55e3-4901-a5bb-637395ba46f6
>>> Process 1 of 2 is on mpi-be6bebee-55e3-4901-a5bb-637395ba46f6
>>> pi is approximately 3.1415926535899388, Error is 0.0000000000001457
>>> wall clock time = 0.003566
>>>
>>> But if I add the remote host to the machine list:
>>> $ cat machinefile2
>>> 198.125.163.212:2
>>> 198.125.163.207:2
>>>
>>> Then some socket is not connected (see attached dump)
>>> Do I need to enable some specific ports so IP=212 can talk to IP=207?
>>>
>>> Is it just port 42548? TCP? UDP? A range of ports?
>>> Do they need to be open on both the master & the worker?
>>> Jan
>>>
>>> -----
>>> [cosy11 at mpi-be6bebee-55e3-4901-a5bb-637395ba46f6 mpich-3.1]$ mpiexec -f machinefile2 -n 4 ./examples/cpi
>>> [proxy:0:1 at mpi-966f395e-bbb1-4a20-8cbc-c10081e91244] HYDU_sock_connect (utils/sock/sock.c:138): unable to get host address for mpi-be6bebee-55e3-4901-a5bb-637395ba46f6 (1)
>>> [proxy:0:1 at mpi-966f395e-bbb1-4a20-8cbc-c10081e91244] main (pm/pmiserv/pmip.c:189): unable to connect to server mpi-be6bebee-55e3-4901-a5bb-637395ba46f6 at port 42548 (check for firewalls!)
>>> ^C[mpiexec at mpi-be6bebee-55e3-4901-a5bb-637395ba46f6] Sending Ctrl-C to processes as requested
>>> [mpiexec at mpi-be6bebee-55e3-4901-a5bb-637395ba46f6] Press Ctrl-C again to force abort
>>> [mpiexec at mpi-be6bebee-55e3-4901-a5bb-637395ba46f6] HYDU_sock_write (utils/sock/sock.c:286): write error (Bad file descriptor)
>>> [mpiexec at mpi-be6bebee-55e3-4901-a5bb-637395ba46f6] HYD_pmcd_pmiserv_send_signal (pm/pmiserv/pmiserv_cb.c:169): unable to write data to proxy
>>> [mpiexec at mpi-be6bebee-55e3-4901-a5bb-637395ba46f6] ui_cmd_cb (pm/pmiserv/pmiserv_pmci.c:79): unable to send signal downstream
>>> [mpiexec at mpi-be6bebee-55e3-4901-a5bb-637395ba46f6] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status
>>> [mpiexec at mpi-be6bebee-55e3-4901-a5bb-637395ba46f6] HYD_pmci_wait_for_completion (pm/pmiserv/pmiserv_pmci.c:198): error waiting for event
>>> [mpiexec at mpi-be6bebee-55e3-4901-a5bb-637395ba46f6] main (ui/mpich/mpiexec.c:336): process manager error waiting for completion
>>>
>
>
> _______________________________________________
> discuss mailing list     discuss at mpich.org
> To manage subscription options or unsubscribe:
> https://lists.mpich.org/mailman/listinfo/discuss


-- 
Antonio J. Peña
Postdoctoral Appointee
Mathematics and Computer Science Division
Argonne National Laboratory
9700 South Cass Avenue, Bldg. 240, Of. 3148
Argonne, IL 60439-4847
apenya at mcs.anl.gov
www.mcs.anl.gov/~apenya


