Kavli Affiliate: Xiang Zhang
| First 5 Authors: Tianchun Wan, Wei Cheng, Dongsheng Luo, Wenchao Yu, Jingchao Ni
| Summary:
Personalized Federated Learning (PFL), which collaboratively trains a
federated model while accounting for individual clients under privacy
constraints, has attracted much attention. Despite its popularity, it has been
observed that existing PFL approaches yield sub-optimal solutions when the
joint distributions of local clients diverge. To address this issue, we present
Federated Modular Network (FedMN), a novel PFL approach that adaptively selects
sub-modules from a module pool to assemble heterogeneous neural architectures
for different clients. FedMN adopts a lightweight routing hypernetwork that
models the joint distribution on each client and produces a personalized
selection of module blocks for that client. To reduce the communication burden
of existing FL, we develop an efficient way for the clients and the server to
interact. Extensive experiments on real-world test beds demonstrate both the
effectiveness and efficiency of the proposed FedMN over the baselines.
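
The per-client module selection described in the summary can be sketched roughly as follows. This is a minimal illustration only: the module pool, the router weights, the client embedding, and the 0.5 threshold are all assumptions for demonstration, not details of the paper's actual FedMN architecture.

```python
import numpy as np

# Hypothetical module pool: each "module block" is a simple transform.
MODULE_POOL = {
    "block_a": lambda x: np.tanh(x),
    "block_b": lambda x: np.maximum(x, 0.0),  # ReLU-like block
    "block_c": lambda x: x * 0.5,
}

def routing_hypernetwork(client_embedding, router_weights):
    """Map a client embedding to per-module selection scores in (0, 1)."""
    logits = router_weights @ client_embedding
    return 1.0 / (1.0 + np.exp(-logits))  # element-wise sigmoid

def assemble_client_model(client_embedding, router_weights, threshold=0.5):
    """Keep modules whose routing score exceeds the threshold and compose them."""
    scores = routing_hypernetwork(client_embedding, router_weights)
    names = list(MODULE_POOL)
    selected = [names[i] for i, s in enumerate(scores) if s > threshold]

    def model(x):
        # Apply the selected blocks in order, yielding a client-specific network.
        for name in selected:
            x = MODULE_POOL[name](x)
        return x

    return selected, model

rng = np.random.default_rng(0)
router_weights = rng.normal(size=(len(MODULE_POOL), 4))  # one routing row per module
client_embedding = rng.normal(size=4)
selected, model = assemble_client_model(client_embedding, router_weights)
print("selected modules:", selected)
```

Different client embeddings can activate different subsets of blocks, which is how heterogeneous architectures arise from one shared pool in this sketch.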
| Search Query: ArXiv Query: search_query=au:"Xiang Zhang"&id_list=&start=0&max_results=10