[mpich-discuss] Get universe size no longer supported?

Reuti reuti at staff.uni-marburg.de
Mon Feb 18 08:09:49 CST 2013


Hi Jeff,

Am 18.02.2013 um 15:02 schrieb Jeff Hammond:

> If you need MPI_UNIVERSE_SIZE to be defined, you can set it explicitly
> via "mpiexec -usize MPI_UNIVERSE_SIZE".

thanks for the hint. In particular -usize SYSTEM seems to restore the former behavior.
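
For reference, a minimal C sketch of the query under discussion (a hypothetical standalone program, not code from the thread; the flag check follows MPI's attribute-caching rules, where the attribute must exist but need not be set):

```c
#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv)
{
    int *usize_ptr = NULL;  /* MPI_Comm_get_attr returns a pointer to the value */
    int flag = 0;

    MPI_Init(&argc, &argv);

    /* Query MPI_UNIVERSE_SIZE on MPI_COMM_WORLD. The standard requires the
       attribute key to exist, but allows "flag" to come back false when the
       process manager does not provide a value. */
    MPI_Comm_get_attr(MPI_COMM_WORLD, MPI_UNIVERSE_SIZE, &usize_ptr, &flag);

    if (flag)
        printf("universe size = %d\n", *usize_ptr);
    else
        printf("MPI_UNIVERSE_SIZE is not set by this process manager\n");

    MPI_Finalize();
    return 0;
}
```

Launched as, e.g., `mpiexec -usize SYSTEM -np 4 ./universe` under Hydra, the flag should come back true again, matching the MPICH2 1.4.1p1 behavior described below.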

-- Reuti


> There very well could be a bug in MPICH right now, but there is at
> least a solution at hand and the MPI standard is not violated.
> 
> Jeff
> 
> On Mon, Feb 18, 2013 at 6:44 AM, Reuti <reuti at staff.uni-marburg.de> wrote:
>> Am 18.02.2013 um 13:37 schrieb Jeff Hammond:
>> 
>>> I've informed the appropriate people about the Trac issue.
>> 
>> Thanks.
>> 
>> 
>>> I recall that MPI stipulates only that MPI_UNIVERSE_SIZE exist; it
>>> need not be defined.
>>> 
>>> I think that MPICH might set MPI_UNIVERSE_SIZE in the process manager.
>>> What PM are you using?
>> 
>> I started with a plain `mpiexec -np 4 ./program`, hence it should use Hydra, as mpiexec is a symbolic link to mpiexec.hydra.
>> 
>> -- Reuti
>> 
>> 
>>> Jeff
>>> 
>>> On Mon, Feb 18, 2013 at 5:52 AM, Reuti <Reuti at staff.uni-marburg.de> wrote:
>>>> Hi,
>>>> 
>>>> a call:
>>>> 
>>>> error=MPI_Comm_get_attr(MPI_COMM_WORLD, MPI_UNIVERSE_SIZE, &universe_sizep, &flag);
>>>> 
>>>> worked in MPICH2 1.4.1p1. But in 1.5 and 3.0.2 the returned "flag" is false, i.e. the attribute is reported as unsupported. Was this info removed from being retrievable?
>>>> 
>>>> -- Reuti
>>>> 
>>>> PS: I found http://trac.mpich.org/projects/mpich/ticket/1426 mentioning MPI_UNIVERSE_SIZE but it throws an error:
>>>> 
>>>> TracError: The Trac Environment needs to be upgraded.
>>>> 
>>>> Run "trac-admin /www/trac.mpich.org/projects/mpich upgrade"
>>>> _______________________________________________
>>>> discuss mailing list     discuss at mpich.org
>>>> To manage subscription options or unsubscribe:
>>>> https://lists.mpich.org/mailman/listinfo/discuss
>>> 
>>> 
>>> 
>>> --
>>> Jeff Hammond
>>> Argonne Leadership Computing Facility
>>> University of Chicago Computation Institute
>>> jhammond at alcf.anl.gov / (630) 252-5381
>>> http://www.linkedin.com/in/jeffhammond
>>> https://wiki.alcf.anl.gov/parts/index.php/User:Jhammond
>> 



