HFedKD
is a meta-algorithm for model-heterogeneous Federated Learning (FL) using Knowledge Distillation. It is designed to handle two key constraints while incurring minimal communication cost:
- The distribution of training samples across clients is non-IID and imbalanced (i.e., some clients may have more samples than others).
- Each client may train a different variant of a base model. For example, different clients may be training different versions of the VGG architecture.
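Because clients hold heterogeneous models, knowledge is exchanged via soft predictions rather than weight averaging. The sketch below shows the standard temperature-scaled distillation loss (Hinton et al.) as a generic illustration; HFedKD's exact objective and the function name `kd_loss` are assumptions, not taken from this repository.

```python
import numpy as np

def kd_loss(student_logits, teacher_logits, T=2.0):
    """Temperature-softened KL divergence, the usual distillation objective.
    A generic sketch -- the actual HFedKD loss may differ in its details."""
    def softmax(z):
        z = z - z.max(axis=-1, keepdims=True)  # numerical stability
        e = np.exp(z)
        return e / e.sum(axis=-1, keepdims=True)

    p = softmax(np.asarray(teacher_logits, dtype=float) / T)  # teacher soft targets
    q = softmax(np.asarray(student_logits, dtype=float) / T)  # student predictions
    # KL(p || q), averaged over the batch and scaled by T^2 as is conventional
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1).mean()
    return float(kl * T * T)
```

Distilling logits instead of weights is what lets clients with different architectures learn from each other: only the output distributions need to be comparable, not the parameter tensors.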
We also provide implementations of isolated and clustered federated learning as baselines for comparison. All three algorithms are evaluated on the AG News and CIFAR-10 datasets using the CharCNN and VGG family architectures, respectively.