Kavli Affiliate: Jia Liu
| First 5 Authors: Haibo Yang, Zhuqing Liu, Jia Liu, Chaosheng Dong, Michinari Momma
| Summary:
In recent years, multi-objective optimization (MOO) has emerged as a
foundational problem underpinning many multi-agent multi-task learning
applications. However, existing algorithms in the MOO literature are limited
to centralized learning settings, which fail to satisfy the distributed nature
and data privacy needs of such applications. This motivates us
to propose a new federated multi-objective learning (FMOL) framework in which
multiple clients distributively and collaboratively solve an MOO problem while
keeping their training data private. Notably, our FMOL framework allows a
different set of objective functions across different clients to support a wide
range of applications, which generalizes the MOO formulation to the federated
learning paradigm for the first time. For this FMOL framework, we
propose two new federated multi-objective optimization (FMOO) algorithms called
federated multi-gradient descent averaging (FMGDA) and federated stochastic
multi-gradient descent averaging (FSMGDA). Both algorithms allow local updates
to significantly reduce communication costs while achieving the same
convergence rates as their algorithmic counterparts in single-objective
federated learning. Our extensive experiments also corroborate
the efficacy of our proposed FMOO algorithms.
| Search Query: ArXiv Query: search_query=au:"Jia Liu"&id_list=&start=0&max_results=3
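
The summary names FMGDA and FSMGDA but does not spell out their steps. Below is a minimal Python sketch of the general recipe the summary describes: each client takes local per-objective gradient steps on its private data, and the server averages the resulting per-objective updates across clients and combines them into a common descent direction via a min-norm (MGDA-style) weighting. Everything here is an illustrative assumption rather than the paper's code: the quadratic toy objectives, the helper names (mgda_weights, LOCAL_STEPS), and the two-objective closed-form min-norm step.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, NUM_CLIENTS, LOCAL_STEPS, LR = 5, 4, 3, 0.1

# Toy stand-in for private client data: client c holds two quadratic
# objectives f_m(x) = ||x - centers[c, m]||^2 (hypothetical, for illustration).
centers = rng.normal(size=(NUM_CLIENTS, 2, DIM))

def mgda_weights(g1, g2):
    """Two-objective closed-form min-norm weights (a standard MGDA step):
    argmin over w in [0, 1] of ||w * g1 + (1 - w) * g2||^2."""
    diff = g1 - g2
    denom = diff @ diff
    if denom == 0.0:
        return 0.5, 0.5
    w = float(np.clip(g2 @ (g2 - g1) / denom, 0.0, 1.0))
    return w, 1.0 - w

x = np.zeros(DIM)  # global model held by the server
for _ in range(100):  # communication rounds
    avg_delta = np.zeros((2, DIM))
    for c in range(NUM_CLIENTS):
        for m in range(2):
            # Client c: LOCAL_STEPS full-gradient steps on objective m.
            # Only the accumulated model delta is reported to the server,
            # so raw data never leaves the client and communication happens
            # once per round rather than once per step.
            x_loc = x.copy()
            for _ in range(LOCAL_STEPS):
                x_loc -= LR * 2.0 * (x_loc - centers[c, m])  # grad of f_m
            avg_delta[m] += (x - x_loc) / NUM_CLIENTS
    # Server: min-norm combination of the averaged per-objective directions,
    # then a common descent step on the global model.
    w1, w2 = mgda_weights(avg_delta[0], avg_delta[1])
    x -= w1 * avg_delta[0] + w2 * avg_delta[1]

# x should settle near a Pareto point between the two objectives' minimizers.
for m in (0, 1):
    loss = np.mean([((x - centers[c, m]) ** 2).sum() for c in range(NUM_CLIENTS)])
    print(f"objective {m}: average loss {loss:.3f}")
```

Replacing the full local gradients with minibatch stochastic gradients would give the stochastic (FSMGDA-style) variant of this sketch; the local-update structure, which is what reduces communication, is unchanged.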