Kavli Affiliate: Feng Wang
| First 5 Authors: Guoyizhe Wei, Feng Wang, Anshul Shah, Rama Chellappa
| Summary:
Federated learning is a distributed machine learning paradigm that allows
multiple clients to collaboratively train a shared model with their local data.
However, conventional federated learning algorithms often struggle to
generalize well due to the ubiquitous domain shift across clients. In this
work, we consider a challenging yet realistic federated learning scenario where
the training data of each client originates from different domains. We address
the challenges of domain shift by leveraging the technique of prompt learning,
and propose a novel method called Federated Dual Prompt Tuning (Fed-DPT).
Specifically, Fed-DPT employs a pre-trained vision-language model and then
applies both visual and textual prompt tuning to facilitate domain adaptation
over decentralized data. Extensive experiments demonstrate Fed-DPT's
significant effectiveness in domain-aware federated learning. With a
pre-trained CLIP model (ViT-Base as the image encoder), the proposed Fed-DPT
attains 68.4% average accuracy over the six domains of the DomainNet dataset,
outperforming the original CLIP by a large margin of 14.8%.
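The abstract does not spell out implementation details, but the dual prompt tuning it describes can be sketched roughly as follows: CoOp-style learnable context vectors on the text side, VPT-style learnable tokens on the vision side, and federated averaging of only those prompt parameters while the CLIP backbones stay frozen. Everything in the sketch below is an illustrative assumption, not the paper's actual code: the names `TextualPrompt`, `VisualPrompt`, and `fed_avg`, the dimensions, and the stand-in for the frozen encoders.

```python
# Minimal sketch of dual (visual + textual) prompt tuning with federated
# averaging of prompt parameters. Names and shapes here are assumptions
# for illustration, not the Fed-DPT reference implementation.
import copy
import torch
import torch.nn as nn

class TextualPrompt(nn.Module):
    """Learnable context vectors prepended to class-name embeddings (CoOp-style)."""
    def __init__(self, n_ctx: int = 16, dim: int = 512):
        super().__init__()
        self.ctx = nn.Parameter(torch.randn(n_ctx, dim) * 0.02)

    def forward(self, class_emb: torch.Tensor) -> torch.Tensor:
        # class_emb: (n_classes, n_tokens, dim); prepend the shared context.
        ctx = self.ctx.unsqueeze(0).expand(class_emb.size(0), -1, -1)
        return torch.cat([ctx, class_emb], dim=1)

class VisualPrompt(nn.Module):
    """Learnable tokens prepended to ViT patch embeddings (VPT-style)."""
    def __init__(self, n_prompts: int = 8, dim: int = 768):
        super().__init__()
        self.tokens = nn.Parameter(torch.randn(n_prompts, dim) * 0.02)

    def forward(self, patch_tokens: torch.Tensor) -> torch.Tensor:
        # patch_tokens: (batch, n_patches, dim); prepend the prompt tokens.
        t = self.tokens.unsqueeze(0).expand(patch_tokens.size(0), -1, -1)
        return torch.cat([t, patch_tokens], dim=1)

def fed_avg(client_states: list[dict]) -> dict:
    """Average the (small) prompt parameters returned by each client."""
    avg = copy.deepcopy(client_states[0])
    for key in avg:
        avg[key] = torch.stack([s[key] for s in client_states]).mean(dim=0)
    return avg

# One communication round: each client tunes only its prompts on local data,
# then the server averages the prompt weights; the CLIP encoders are frozen.
clients = [nn.ModuleDict({"text": TextualPrompt(), "vis": VisualPrompt()})
           for _ in range(6)]  # e.g. one client per DomainNet domain
for client in clients:
    pass  # local training over the client's own domain data would go here
global_state = fed_avg([c.state_dict() for c in clients])
for client in clients:
    client.load_state_dict(global_state)
```

One appeal of this setup is communication cost: because only the prompt parameters are exchanged and averaged, each round transmits a few thousand values rather than the full CLIP backbone.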
| Search Query: ArXiv Query: search_query=au:"Feng Wang"&id_list=&start=0&max_results=3