Kavli Affiliate: Jia Liu
| First 5 Authors: Haibo Yang, Peiwen Qiu, Jia Liu, Aylin Yener
| Summary:
This paper considers over-the-air federated learning (OTA-FL). OTA-FL
exploits the superposition property of the wireless medium to perform model
aggregation over the air essentially for free, which greatly reduces the
communication cost of uploading model updates from the edge devices. To
fully exploit this advantage while achieving learning performance comparable
to conventional federated learning, which presumes model aggregation over
noiseless channels, we jointly design the transmission scaling and the
number of local iterations in each round, subject to the power constraint at
each edge device. We first characterize the training error induced by
channel noise in OTA-FL by establishing a fundamental lower bound for
general objective functions with Lipschitz-continuous gradients. Then, by
introducing an adaptive transceiver power scaling scheme, we propose an
over-the-air federated learning algorithm with joint adaptive computation
and power control (ACPC-OTA-FL). We provide a convergence analysis of
ACPC-OTA-FL for training with non-convex objective functions and
heterogeneous data, and show that its convergence rate matches that of FL
with noise-free communication.
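
To make the aggregation mechanism concrete, below is a minimal numerical
sketch of one OTA-FL round, assuming a unit channel gain, a common
per-device transmit power budget, and placeholder local updates; all names
and values (num_devices, power_budget, noise_std, local_update, etc.) are
illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of over-the-air aggregation with per-device power scaling.
import numpy as np

rng = np.random.default_rng(0)
dim, num_devices = 10, 5
power_budget = 1.0          # per-device transmit power constraint (assumed)
noise_std = 0.1             # additive channel noise at the server (assumed)

def local_update(model, num_local_steps, lr=0.01):
    """Placeholder for local SGD: returns a synthetic accumulated update."""
    grad = rng.normal(size=model.shape)   # stand-in for a stochastic gradient
    return -lr * num_local_steps * grad   # accumulated local model change

model = np.zeros(dim)
for rnd in range(100):
    updates = [local_update(model, num_local_steps=5)
               for _ in range(num_devices)]
    # Each device scales its update so the transmitted signal meets the
    # power budget; a common scaling keeps the superposed signals aligned.
    alpha = min(np.sqrt(power_budget) / (np.linalg.norm(u) + 1e-12)
                for u in updates)
    # The wireless channel superimposes all transmissions and adds noise:
    received = sum(alpha * u for u in updates) \
        + rng.normal(scale=noise_std, size=dim)
    # The server rescales to recover an approximate average of the updates.
    model += received / (alpha * num_devices)
```

In this sketch the power scaling and the number of local steps are fixed
across rounds; the paper's ACPC-OTA-FL instead adapts both jointly per
round, which is what the convergence analysis covers.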
| Search Query: ArXiv Query: search_query=au:"Jia Liu"&id_list=&start=0&max_results=10