Communication-Efficient and Privacy-Preserving Feature-based Federated Transfer Learning

Kavli Affiliate: Feng Wang

| First 5 Authors: Feng Wang, M. Cenk Gursoy, Senem Velipasalar

| Summary:

Federated learning has attracted growing interest because it preserves the
clients’ privacy. As a variant of federated learning, federated transfer
learning leverages knowledge from similar tasks and has therefore also been
studied intensively. However, due to the limited radio spectrum, the
communication efficiency of federated learning over wireless links is critical,
since some tasks may require thousands of terabytes of uplink payload. To
improve the communication efficiency, in this paper we propose feature-based
federated transfer learning as an approach that reduces the uplink payload by
more than five orders of magnitude compared to existing approaches. We first
introduce the system design, in which the extracted features and outputs are
uploaded instead of parameter updates, then determine the payload required by
this approach and compare it with that of existing approaches. Subsequently, we
analyze the random shuffling scheme that preserves the clients’ privacy.
Finally, we evaluate the performance of the proposed learning scheme through
experiments on an image classification task to demonstrate its effectiveness.
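As a rough illustration of the idea described above, the sketch below shows what a client-side step might look like when extracted features and outputs are uploaded instead of parameter updates, with the sample order randomly shuffled before upload. This is a minimal sketch under assumed details, not the authors' implementation: the backbone model, input shapes, and the `client_upload` helper are hypothetical placeholders.

```python
# Minimal sketch (not the paper's reference implementation) of a client-side step
# in feature-based federated transfer learning: the client uploads extracted
# features and their labels/outputs rather than model parameter updates, and
# randomly shuffles the sample order before upload as a privacy measure.
import torch
import torch.nn as nn


def client_upload(frozen_backbone: nn.Module, images: torch.Tensor, labels: torch.Tensor):
    """Extract features with a frozen (pre-trained) backbone and return a
    shuffled (features, labels) payload for the uplink."""
    frozen_backbone.eval()
    with torch.no_grad():
        features = frozen_backbone(images)       # e.g., an [N, d] feature matrix
    perm = torch.randperm(features.shape[0])     # random shuffling of sample order
    return features[perm], labels[perm]


if __name__ == "__main__":
    # Hypothetical usage: a stand-in feature extractor and a small batch of images.
    backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 64))
    imgs = torch.randn(8, 3, 32, 32)
    lbls = torch.randint(0, 10, (8,))
    feats, ys = client_upload(backbone, imgs, lbls)
    # The uplink payload is the shuffled feature matrix plus outputs/labels,
    # which is far smaller than a full set of model parameter updates.
    print(feats.shape, ys.shape)
```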

| Search Query: ArXiv Query: search_query=au:"Feng Wang"&id_list=&start=0&max_results=10
