diffdist
is a Python library for PyTorch. It extends the default functionality of torch.autograd and adds support for differentiable communication between processes. This enables backpropagation to work in distributed settings and makes distributed model parallelism straightforward to use.
After installing PyTorch, install diffdist with:
$ pip install diffdist
License: GNU GPLv3