Kavli Affiliate: Yi Zhou
| First 5 Authors: Ziyi Chen, Qunwei Li, Yi Zhou
| Summary:
Standard gradient descent-ascent (GDA)-type algorithms can only find
stationary points in nonconvex minimax optimization, and such points can be
highly sub-optimal compared with local minimax points. In this work, we develop GDA-type
algorithms that globally converge to local minimax points in
nonconvex-strongly-concave minimax optimization. We first observe that local
minimax points are equivalent to second-order stationary points of a certain
envelope function. Then, inspired by the classic cubic regularization
algorithm, we propose Cubic-GDA, a cubic-regularized GDA algorithm for finding
local minimax points, and provide a comprehensive convergence analysis by
leveraging its intrinsic potential function. Specifically, we establish the
global convergence of Cubic-GDA to a local minimax point at a sublinear
convergence rate. We further analyze the asymptotic convergence rate of
Cubic-GDA over the full spectrum of a local gradient dominance-type nonconvex
geometry, establishing order-wise faster asymptotic convergence rates than the
standard GDA. Moreover, we propose a stochastic variant of Cubic-GDA for
large-scale minimax optimization, and characterize its sample complexity under
stochastic sub-sampling.
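
To make the iteration structure concrete, below is a minimal, hypothetical NumPy sketch of a Cubic-GDA-style loop on a toy nonconvex-strongly-concave problem: inner gradient ascent drives y toward the maximizer, after which the x-update approximately minimizes a cubic-regularized second-order model of the envelope function Phi(x) = max_y f(x, y). The toy objective f, the step sizes, the cubic penalty M, and the simple gradient-descent subproblem solver are all illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

# Toy nonconvex-strongly-concave problem (illustrative, not from the paper):
#   f(x, y) = 0.25*||x||^4 - 0.5*||x||^2 + x^T y - ||y||^2
# f is strongly concave in y, and the envelope Phi(x) = max_y f(x, y)
# = 0.25*||x||^4 - 0.25*||x||^2 is nonconvex with a strict saddle at x = 0.

def grad_f_x(x, y):
    return np.dot(x, x) * x - x + y

def grad_f_y(x, y):
    return x - 2.0 * y

def hess_phi(x):
    # Exact envelope Hessian for this toy f (here y*(x) = x/2 in closed
    # form); in general such second-order information must be estimated.
    d = len(x)
    return np.dot(x, x) * np.eye(d) + 2.0 * np.outer(x, x) - 0.5 * np.eye(d)

def cubic_step(g, H, M=2.0, iters=100, eta=0.05):
    # Approximately minimize the cubic model
    #   m(s) = g^T s + 0.5 * s^T H s + (M/6) * ||s||^3
    # by plain gradient descent, as a stand-in for an exact cubic subsolver.
    s = np.zeros_like(g)
    for _ in range(iters):
        s -= eta * (g + H @ s + 0.5 * M * np.linalg.norm(s) * s)
    return s

rng = np.random.default_rng(0)
x, y = rng.standard_normal(3), rng.standard_normal(3)
for t in range(100):
    for _ in range(25):              # inner gradient ascent on y
        y += 0.25 * grad_f_y(x, y)
    g = grad_f_x(x, y)               # ~ grad Phi(x) by Danskin's theorem
    x += cubic_step(g, hess_phi(x))  # cubic-regularized update on x

print("||x||^2 ->", x @ x, "(local minimax points of this toy have ||x||^2 = 0.5)")
```

Because the second-order cubic model is minimized rather than a plain gradient step taken, the x-update can escape strict saddle points of the envelope function (such as x = 0 in this toy), which is what allows convergence to second-order stationary points, i.e., local minimax points.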
| Search Query: ArXiv Query: search_query=au:"Yi Zhou"&id_list=&start=0&max_results=10