Federated Contrastive Learning: Maximizing Global Mutual Information for Unsupervised and Semi-Supervised Representation Learning
Federated contrastive learning can be formulated as maximizing a lower bound on the global mutual information between representations of two views of the data, which leads to principled extensions of SimCLR to the federated setting for both unsupervised and semi-supervised learning.
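The lower bound referred to here is of the InfoNCE type used by SimCLR. As a minimal sketch (not the paper's exact federated protocol), the snippet below computes the InfoNCE bound on the mutual information between two views, assuming the representations of examples pooled across clients have already been gathered into two aligned matrices; the function name and pooling assumption are illustrative only.

```python
import numpy as np

def infonce_lower_bound(z1, z2, temperature=0.1):
    """InfoNCE lower bound on the mutual information between two views.

    z1, z2: (N, d) arrays of L2-normalized representations of the same
    N examples under two augmentations. In the federated setting, the
    N examples are assumed here to be pooled across clients (a
    simplifying assumption of this sketch).
    Returns a scalar: I(z1; z2) >= log N + mean_i log p(i | i),
    where p(. | i) is a softmax over similarities to row i of z1.
    """
    # Scaled cosine similarities between every view-1 / view-2 pair.
    sim = z1 @ z2.T / temperature                      # (N, N)
    # Row-wise log-softmax; the positives sit on the diagonal.
    logits = sim - sim.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    n = z1.shape[0]
    return np.log(n) + np.mean(np.diag(log_prob))

def normalize(x):
    # Project rows onto the unit sphere, as SimCLR does before the loss.
    return x / np.linalg.norm(x, axis=1, keepdims=True)
```

Maximizing this quantity over the encoder producing `z1` and `z2` is equivalent (up to the constant `log N`) to minimizing the familiar SimCLR-style contrastive loss; the bound is always at most `log N`, which is why larger effective batch sizes (here, more examples pooled across clients) give a tighter ceiling on the estimated mutual information.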