[mpich-discuss] checking build system type... config.sub: missing argument
Nestor Waldyd Alvarez Villa
nestor.alvarez at alfa.upb.edu.co
Thu Sep 10 19:49:21 CDT 2015
Hello,
I was using the following configure command:
configure --prefix=/usr/local \
--enable-fast=O2 \
--enable-fortran=yes --enable-cxx \
--with-device=ch3:nemesis \
--with-pm=hydra \
--enable-romio \
--enable-debuginfo \
--enable-versioning \
--enable-threads=multiple --enable-thread-cs=global --enable-refcount=none \
--enable-mutex-timing --enable-handle-allocation=mutex \
--with-shared-memory=auto \
--with-java=/opt/java/jdk1.8.0_51 \
CC=pgcc CFLAGS=-I"/opt/pgi/linux86-64/15.7/include/CC" MPICHLIB_CFLAGS=-I"/opt/pgi/linux86-64/15.7/include/CC" \
CXX=pgCC \ `#CXXFLAGS= MPICHLIB_CXXFLAGS=` \
F77=pgf77 \ `#FFLAGS= MPICHLIB_FFLAGS=` \
FC=pgfortran \ `#FCFLAGS= MPICHLIB_FCFLAGS=` \
`#LDFLAGS= MPICHLIB_LDFLAGS= # Library, -L<library>` \
`#LIBS= MPICHLIB_LIBS= # linker, -l<library>` \
CPPFLAGS="/opt/pgi/linux86-64/15.7/include/CC" \ `# # header, -I<include dir>`
Then, after dropping everything except --with-java and the compiler settings (leaving the rest at the configure defaults):
~/mpich-3.1.4/configure --with-java=/opt/java/jdk1.8.0_51 \
CC=pgcc \
CXX=pgCC \
F77=pgf77 \
FC=pgfortran
The configure script worked just fine.
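If the full option set is needed, a cleaned-up version of the first command is sketched below. A plausible culprit for the original failure is the "\ " (backslash-space) sequences in front of the backtick comments: they leave stray whitespace-only arguments on the configure command line, which configure then tries to treat as a build triplet and passes on (empty) to config.sub. The sketch simply drops the inline comments and adds the -I that CPPFLAGS normally expects; everything else is taken unchanged from the command above:
~/mpich-3.1.4/configure --prefix=/usr/local \
  --enable-fast=O2 \
  --enable-fortran=yes --enable-cxx \
  --with-device=ch3:nemesis \
  --with-pm=hydra \
  --enable-romio \
  --enable-debuginfo \
  --enable-versioning \
  --enable-threads=multiple --enable-thread-cs=global --enable-refcount=none \
  --enable-mutex-timing --enable-handle-allocation=mutex \
  --with-shared-memory=auto \
  --with-java=/opt/java/jdk1.8.0_51 \
  CC=pgcc CFLAGS=-I/opt/pgi/linux86-64/15.7/include/CC \
  MPICHLIB_CFLAGS=-I/opt/pgi/linux86-64/15.7/include/CC \
  CXX=pgCC \
  F77=pgf77 \
  FC=pgfortran \
  CPPFLAGS=-I/opt/pgi/linux86-64/15.7/include/CC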
Best Regards,
--
--------------------------------------------------------------
Nestor Waldyd Alvarez Villa
Ingeniero Electrónico. MSc. C. Telecomunicaciones.
Universidad Pontificia Bolivariana
mailto:nestor.alvarez at alfa.upb.edu.co
Medellín - Antioquía.
COLOMBIA
--------------------------------------------------------------
________________________________________
From: discuss-request at mpich.org <discuss-request at mpich.org>
Sent: Wednesday, September 9, 2015, 06:01
To: discuss at mpich.org
Subject: discuss Digest, Vol 35, Issue 4
Send discuss mailing list submissions to
discuss at mpich.org
To subscribe or unsubscribe via the World Wide Web, visit
https://lists.mpich.org/mailman/listinfo/discuss
or, via email, send a message with subject or body 'help' to
discuss-request at mpich.org
You can reach the person managing the list at
discuss-owner at mpich.org
When replying, please edit your Subject line so it is more specific
than "Re: Contents of discuss digest..."
Today's Topics:
1. mpich-master-v3.2b4-211-gf91baf0296ce: error spawning
   processes (Siegmar Gross)
2. Re: checking build system type... config.sub: missing
   argument (Kenneth Raffenetti)
----------------------------------------------------------------------
Message: 1
Date: Tue, 8 Sep 2015 10:09:05 +0200
From: Siegmar Gross <Siegmar.Gross at informatik.hs-fulda.de>
To: discuss at mpich.org
Subject: [mpich-discuss] mpich-master-v3.2b4-211-gf91baf0296ce: error
spawning processes
Message-ID: <55EE97A1.6040103 at informatik.hs-fulda.de>
Content-Type: text/plain; charset="utf-8"; Format="flowed"
Hi,
yesterday I built mpich-master-v3.2b4-211-gf91baf0296ce on
my machines (Solaris 10 Sparc, Solaris 10 x86_64, and openSUSE
Linux 12.1 x86_64) with gcc-5.1.0 and Sun C 5.13. I get the
following errors when I run small programs that spawn processes
on two Sparc machines. "mpiexec" is aliased to 'mpiexec -genvnone'.
It doesn't matter whether I use my cc- or my gcc-built version of MPICH.
tyr spawn 120 mpiexec -np 1 --host tyr,rs0 spawn_master
Parent process 0 running on tyr.informatik.hs-fulda.de
I create 4 slave processes
Fatal error in MPI_Init: Unknown error class, error stack:
MPIR_Init_thread(472).................:
MPID_Init(302)........................: spawned process group was unable
to connect back to the parent on port
<tag#0$description#tyr$port#40568$ifname#193.174.24.39$>
MPID_Comm_connect(191)................:
MPIDI_Comm_connect(488)...............:
SetupNewIntercomm(1187)...............:
MPIR_Barrier_intra(150)...............:
barrier_smp_intra(96).................:
MPIR_Barrier_impl(332)................: Failure during collective
MPIR_Barrier_impl(327)................:
MPIR_Barrier(292).....................:
MPIR_Barrier_intra(169)...............:
MPIDU_Complete_posted_with_error(1137): Process failed
barrier_smp_intra(111)................:
MPIR_Bcast_impl(1452).................:
MPIR_Bcast(1476)......................:
MPIR_Bcast_intra(1287)................:
MPIR_Bcast_binomial(310)..............: Failure during collective
Fatal error in MPI_Init: Unknown error class, error stack:
MPIR_Init_thread(472)...:
MPID_Init(302)..........: spawned process group was unable to connect
back to the parent on port
<tag#0$description#tyr$port#40568$ifname#193.174.24.39$>
MPID_Comm_connect(191)..:
MPIDI_Comm_connect(488).:
SetupNewIntercomm(1187).:
MPIR_Barrier_intra(150).:
barrier_smp_intra(111)..:
MPIR_Bcast_impl(1452)...:
MPIR_Bcast(1476)........:
MPIR_Bcast_intra(1287)..:
MPIR_Bcast_binomial(310): Failure during collective
tyr spawn 121
I get the following error, or something similar to the one above,
with "mpiexec -np 1 --host tyr,rs0 spawn_multiple_master"
and "mpiexec -np 1 --host tyr,rs0 spawn_intra_comm".
tyr spawn 127 mpiexec -np 1 --host tyr,rs0 spawn_multiple_master
Parent process 0 running on tyr.informatik.hs-fulda.de
I create 3 slave processes.
Fatal error in MPI_Comm_spawn_multiple: Unknown error class, error stack:
MPI_Comm_spawn_multiple(162)..: MPI_Comm_spawn_multiple(count=2,
cmds=ffffffff7fffdf08, argvs=ffffffff7fffdef8,
maxprocs=ffffffff7fffdef0, infos=ffffffff7fffdee8, root=0,
MPI_COMM_WORLD, intercomm=ffffffff7fffdee4, errors=0) failed
MPIDI_Comm_spawn_multiple(274):
MPID_Comm_accept(153).........:
MPIDI_Comm_accept(1057).......:
MPIR_Bcast_intra(1287)........:
MPIR_Bcast_binomial(310)......: Failure during collective
===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= PID 14925 RUNNING AT rs0
= EXIT CODE: 10
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
tyr spawn 128
Sometimes I also get this error message.
tyr spawn 129 mpiexec -np 1 --host tyr,rs0 spawn_multiple_master
Parent process 0 running on tyr.informatik.hs-fulda.de
I create 3 slave processes.
===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= PID 11444 RUNNING AT tyr
= EXIT CODE: 10
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
[proxy:0:0 at tyr.informatik.hs-fulda.de] HYD_pmcd_pmip_control_cmd_cb
(../../../../mpich-master-v3.2b4-211-gf91baf0296ce/src/pm/hydra/pm/pmiserv/pmip_cb.c:885):
assert (!closed) failed
[proxy:0:0 at tyr.informatik.hs-fulda.de] HYDT_dmxu_poll_wait_for_event
(../../../../mpich-master-v3.2b4-211-gf91baf0296ce/src/pm/hydra/tools/demux/demux_poll.c:76):
callback returned error status
[proxy:0:0 at tyr.informatik.hs-fulda.de] main
(../../../../mpich-master-v3.2b4-211-gf91baf0296ce/src/pm/hydra/pm/pmiserv/pmip.c:206):
demux engine error waiting for event
[proxy:1:1 at rs0.informatik.hs-fulda.de] HYD_pmcd_pmip_control_cmd_cb
(../../../../mpich-master-v3.2b4-211-gf91baf0296ce/src/pm/hydra/pm/pmiserv/pmip_cb.c:885):
assert (!closed) failed
[proxy:1:1 at rs0.informatik.hs-fulda.de] HYDT_dmxu_poll_wait_for_event
(../../../../mpich-master-v3.2b4-211-gf91baf0296ce/src/pm/hydra/tools/demux/demux_poll.c:76):
callback returned error status
[proxy:1:1 at rs0.informatik.hs-fulda.de] main
(../../../../mpich-master-v3.2b4-211-gf91baf0296ce/src/pm/hydra/pm/pmiserv/pmip.c:206):
demux engine error waiting for event
[mpiexec at tyr.informatik.hs-fulda.de] HYDT_bscu_wait_for_completion
(../../../../mpich-master-v3.2b4-211-gf91baf0296ce/src/pm/hydra/tools/bootstrap/utils/bscu_wait.c:75):
one of the processes terminated badly; aborting
[mpiexec at tyr.informatik.hs-fulda.de] HYDT_bsci_wait_for_completion
(../../../../mpich-master-v3.2b4-211-gf91baf0296ce/src/pm/hydra/tools/bootstrap/src/bsci_wait.c:23):
launcher returned error waiting for completion
[mpiexec at tyr.informatik.hs-fulda.de] HYD_pmci_wait_for_completion
(../../../../mpich-master-v3.2b4-211-gf91baf0296ce/src/pm/hydra/pm/pmiserv/pmiserv_pmci.c:218):
launcher returned error waiting for completion
[mpiexec at tyr.informatik.hs-fulda.de] main
(../../../../mpich-master-v3.2b4-211-gf91baf0296ce/src/pm/hydra/ui/mpich/mpiexec.c:344):
process manager error waiting for completion
tyr spawn 130
Sometimes it even works.
tyr spawn 131 mpiexec -np 1 --host tyr,rs0 spawn_multiple_master
Parent process 0 running on tyr.informatik.hs-fulda.de
I create 3 slave processes.
Parent process 0: tasks in MPI_COMM_WORLD: 1
tasks in COMM_CHILD_PROCESSES local group: 1
tasks in COMM_CHILD_PROCESSES remote group: 3
Slave process 2 of 3 running on rs0.informatik.hs-fulda.de
Slave process 0 of 3 running on rs0.informatik.hs-fulda.de
Slave process 1 of 3 running on tyr.informatik.hs-fulda.de
spawn_slave 0: argv[0]: spawn_slave
spawn_slave 0: argv[1]: program type 1
spawn_slave 2: argv[0]: spawn_slave
spawn_slave 2: argv[1]: program type 2
spawn_slave 2: argv[2]: another parameter
spawn_slave 1: argv[0]: spawn_slave
spawn_slave 1: argv[1]: program type 2
spawn_slave 1: argv[2]: another parameter
tyr spawn 132
The programs seem to work fine on my x86_64 machines;
at least I was not able to trigger an error there.
tyr spawn 121 ssh linpc1
linpc1 fd1026 107 mpiexec -np 1 --host sunpc0,linpc1 spawn_master
Parent process 0 running on sunpc0
I create 4 slave processes
Parent process 0: tasks in MPI_COMM_WORLD: 1
tasks in COMM_CHILD_PROCESSES local group: 1
tasks in COMM_CHILD_PROCESSES remote group: 4
Slave process 0 of 4 running on linpc0
Slave process 2 of 4 running on linpc0
Slave process 1 of 4 running on sunpc0
Slave process 3 of 4 running on sunpc0
spawn_slave 1: argv[0]: spawn_slave
spawn_slave 3: argv[0]: spawn_slave
spawn_slave 0: argv[0]: spawn_slave
spawn_slave 2: argv[0]: spawn_slave
linpc1 fd1026 102 mpiexec -np 1 --host sunpc0,linpc0 spawn_multiple_master
Parent process 0 running on sunpc0
I create 3 slave processes.
Parent process 0: tasks in MPI_COMM_WORLD: 1
tasks in COMM_CHILD_PROCESSES local group: 1
tasks in COMM_CHILD_PROCESSES remote group: 3
Slave process 0 of 3 running on linpc0
Slave process 2 of 3 running on linpc0
Slave process 1 of 3 running on sunpc0
spawn_slave 0: argv[0]: spawn_slave
spawn_slave 0: argv[1]: program type 1
spawn_slave 1: argv[0]: spawn_slave
spawn_slave 1: argv[1]: program type 2
spawn_slave 1: argv[2]: another parameter
spawn_slave 2: argv[0]: spawn_slave
spawn_slave 2: argv[1]: program type 2
spawn_slave 2: argv[2]: another parameter
linpc1 fd1026 103
linpc1 fd1026 103 mpiexec -np 1 --host sunpc0,linpc0 spawn_intra_comm
Parent process 0: I create 2 slave processes
Parent process 0 running on sunpc0
MPI_COMM_WORLD ntasks: 1
COMM_CHILD_PROCESSES ntasks_local: 1
COMM_CHILD_PROCESSES ntasks_remote: 2
COMM_ALL_PROCESSES ntasks: 3
mytid in COMM_ALL_PROCESSES: 0
Child process 1 running on sunpc0
MPI_COMM_WORLD ntasks: 2
COMM_ALL_PROCESSES ntasks: 3
mytid in COMM_ALL_PROCESSES: 2
Child process 0 running on linpc0
MPI_COMM_WORLD ntasks: 2
COMM_ALL_PROCESSES ntasks: 3
mytid in COMM_ALL_PROCESSES: 1
linpc1 fd1026 104
Kind regards
Siegmar
------------------------------
Message: 2
Date: Tue, 8 Sep 2015 08:52:38 -0500
From: Kenneth Raffenetti <raffenet at mcs.anl.gov>
To: <discuss at mpich.org>
Subject: Re: [mpich-discuss] checking build system type... config.sub:
missing argument
Message-ID: <55EEE826.10801 at mcs.anl.gov>
Content-Type: text/plain; charset="utf-8"; format=flowed
Have you tried building with the system-provided gcc/g++/gfortran?
Your configure line looks okay; I want to eliminate the PGI compilers from
the potential problem list.
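For example, something along these lines with the system toolchain (the prefix here is just a placeholder):
~/mpich-3.1.4/configure --prefix=/tmp/mpich-gnu \
  CC=gcc CXX=g++ F77=gfortran FC=gfortran
It can also help to run config.sub by hand, to see whether the script itself is broken or whether configure is handing it a bad argument:
/bin/sh ~/mpich-3.1.4/confdb/config.sub x86_64-unknown-linux-gnu
A healthy config.sub simply echoes back the canonicalized triplet.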
Ken
On 09/04/2015 04:43 PM, Nestor Waldyd Alvarez Villa wrote:
> Hello,
>
>
> I am trying to configure mpich 3.1.4 on Linux 4.1.3-201.fc22.x86_64 #1
> SMP without success. The following was prompted:
>
>
> checking the archiver (ar) interface... ar
>
> checking build system type... config.sub: missing argument
>
> Try `config.sub --help' for more information.
>
> configure: error: /bin/sh /home/waldyd/mpich-3.1.4/confdb/config.sub
> failed
>
>
> Attached is the config.log file. The error lines are:
>
> configure:6811: ar cru libconftest.a conftest.o >&5
> ar: `u' modifier ignored since `D' is the default (see `U')
> configure:6814: $? = 0
> configure:6842: result: ar
> configure:6892: checking build system type
> configure:6903: error: /bin/sh
> /home/waldyd/mpich-3.1.4/confdb/config.sub failed
>
> How can I solve this issue?
>
> Best regards,
>
>
> --
> --------------------------------------------------------------
>
> Nestor Waldyd Alvarez Villa
> Ingeniero Electrónico. MSc. C. Telecomunicaciones.
> Universidad Pontificia Bolivariana
> mailto:nestor.alvarez at alfa.upb.edu.co
> Medellín - Antioquía.
> COLOMBIA
> --------------------------------------------------------------
>
>
> _______________________________________________
> discuss mailing list discuss at mpich.org
> To manage subscription options or unsubscribe:
> https://lists.mpich.org/mailman/listinfo/discuss
>
------------------------------
_______________________________________________
discuss mailing list
discuss at mpich.org
https://lists.mpich.org/mailman/listinfo/discuss
End of discuss Digest, Vol 35, Issue 4
**************************************
_______________________________________________
discuss mailing list discuss at mpich.org
To manage subscription options or unsubscribe:
https://lists.mpich.org/mailman/listinfo/discuss