@dasayan05
Created February 27, 2019 09:33
Basic usage of all-reduce: every rank contributes a tensor, and after the call every rank holds the element-wise sum.
import torch
import torch.distributed as dist

def main(rank, world):
    # Each rank contributes a different scalar (assumes a world size of 3)
    if rank == 0:
        x = torch.tensor([1.])
    elif rank == 1:
        x = torch.tensor([2.])
    elif rank == 2:
        x = torch.tensor([-3.])
    # all_reduce sums the tensors across all ranks in place,
    # so every rank ends up with the same value: 1 + 2 - 3 = 0
    dist.all_reduce(x, op=dist.ReduceOp.SUM)
    print('Rank {} has {}'.format(rank, x))

if __name__ == '__main__':
    dist.init_process_group(backend='mpi')
    main(dist.get_rank(), dist.get_world_size())
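Since the script initializes the process group with the MPI backend, PyTorch must be built with MPI support. Assuming that, it can be launched with three processes via a standard MPI launcher (the filename allreduce.py here is illustrative):

    mpirun -np 3 python allreduce.py

Each of the three processes should print the same reduced result, e.g. "Rank 0 has tensor([0.])". With a stock pip/conda PyTorch build, the same snippet would need the gloo or nccl backend and the usual rank/world-size environment variables instead.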