FedPCL: Learning to Blend Representations for Federated Prototype Learning

Guodong Long
Jie Ma
Jing Jiang
Lu Liu
Tianyi Zhou
Yue Tan
AAAI (2022)

Abstract

Federated Learning (FL) is a machine learning paradigm that allows decentralized clients to learn collaboratively without sharing their local data. However, excessive communication and computation demands as training progresses pose challenges to current frameworks. To alleviate these challenges, we propose a lightweight FL framework in which powerful foundation models serve as fixed pre-trained backbones to extract domain-specific features, and knowledge is shared via prototypes. The framework's objective is to learn to blend these features efficiently in a personalized manner. To achieve this, we propose a novel algorithm, Federated Prototype-wise Contrastive Learning (FedPCL), which allows clients to efficiently share similar high-level semantics via class-wise prototypes without exposing each participant's local model or data. FedPCL encourages clients to capture more class-relevant information from the abundant general representations output by the backbones, leading to better representation ability. Extensive experiments show that FedPCL outperforms state-of-the-art methods on multiple datasets under three different non-IID settings.
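The two building blocks the abstract describes — class-wise prototypes averaged from fixed-backbone features, and a prototype-wise contrastive loss — can be sketched as follows. This is a minimal NumPy illustration of the general idea, not the paper's implementation; the function names, the InfoNCE-style loss form, and the temperature parameter `tau` are assumptions for illustration.

```python
import numpy as np

def class_prototypes(features, labels, num_classes):
    """Average each class's fixed-backbone features into one prototype.

    These per-class mean vectors are the kind of compact summary a client
    could share instead of its raw data or model weights.
    """
    dim = features.shape[1]
    protos = np.zeros((num_classes, dim))
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            protos[c] = features[mask].mean(axis=0)
    return protos

def prototype_contrastive_loss(z, y, protos, tau=0.5):
    """InfoNCE-style loss: pull each embedding toward its own class
    prototype and push it away from the other classes' prototypes.
    (A hypothetical form of the prototype-wise contrastive objective.)
    """
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    p = protos / np.linalg.norm(protos, axis=1, keepdims=True)
    logits = z @ p.T / tau                        # (batch, num_classes)
    logits -= logits.max(axis=1, keepdims=True)   # numeric stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(len(y)), y].mean()
```

In a full federated round, each client would compute its local prototypes, the server would aggregate them (e.g., by class-wise averaging), and clients would then minimize a loss of this shape against the aggregated prototypes.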