Unexpected output from mpi4py

I am new to MPI with Python and I have a problem here. This is my code:

from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    a = 1
    comm.bcast(a, root=0)
    s = comm.reduce(a, op=MPI.SUM)
    print('From process 0, sum =', s)
elif rank == 1:
    b = 2
    comm.bcast(b, root=1)
    x = comm.reduce(b, op=MPI.SUM)
    print('From process 1, sum =', x)

I want each process to print: From process PROCESS_NUMBER, sum = 3

Process 0 prints correctly, but process 1 prints None.

I do not understand why. Can anyone help me?

2 answers
  • Any collective operation (bcast, reduce) must be called on all processes, so it is incorrect to place one inside an if rank == N branch.
  • In the second reduction you must specify root=1.
  • The result of the broadcast must be assigned: a = comm.bcast(a, root=0).

Corrected Code:

from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# First broadcast/reduce pair, rooted at process 0
if rank == 0:
    a = 1
else:
    a = None
a = comm.bcast(a, root=0)
s = comm.reduce(a, op=MPI.SUM)  # root defaults to 0
if rank == 0:
    print('From process 0, sum =', s)

# Second broadcast/reduce pair, rooted at process 1
if rank == 1:
    b = 2
else:
    b = None
b = comm.bcast(b, root=1)
x = comm.reduce(b, op=MPI.SUM, root=1)
if rank == 1:
    print('From process 1, sum =', x)
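
To run it with three processes, a minimal launch sketch (the script name sum_demo.py is just an example):

mpiexec -n 3 python sum_demo.py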

With three processes the output is:

From process 0, sum = 3
From process 1, sum = 6

comm.reduce(a, op=MPI.SUM) corresponds to MPI_Reduce(): the sum is available only on the root process; the other processes receive None.

If you want the sum to be available on every process of the communicator, you can use comm.allreduce(a, op=MPI.SUM), which corresponds to MPI_Allreduce(). See this page to learn more about the difference between MPI_Reduce() and MPI_Allreduce().
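
As an illustration (not part of the original answer), a minimal sketch of the allreduce variant, in which every process ends up with the sum:

from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Each process contributes rank + 1; with three processes the sum is 1 + 2 + 3 = 6
total = comm.allreduce(rank + 1, op=MPI.SUM)

# Unlike reduce(), allreduce() returns the result on every process, not just the root
print('From process', rank, 'sum =', total)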
