r"""
A collection of common utilities for distributed training. These are thin
wrappers over utilities from the :mod:`torch.distributed` module; unlike the
originals, they do not raise exceptions when distributed training is
unavailable (for example, CPU-only training) and instead fall back to
sensible default behavior.
"""
from typing import Callable, Dict, Tuple, Union
from loguru import logger
import torch
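As a minimal sketch of the fallback pattern the docstring describes, a wrapper such as the hypothetical `get_world_size` below (name and signature are assumptions, not taken from the source) returns a sensible default instead of raising when `torch.distributed` is not initialized:

```python
import torch.distributed as dist


def get_world_size() -> int:
    """Return the number of processes in the default process group.

    Falls back to 1 (single-process training) when distributed training
    is unavailable or has not been initialized, instead of raising.
    """
    if dist.is_available() and dist.is_initialized():
        return dist.get_world_size()
    return 1
```

In a plain CPU-only run with no process group initialized, this returns 1, so calling code can divide work by world size without special-casing non-distributed execution.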