Kavli Affiliate: Yi Zhou
| First 5 Authors: Yufeng Yang, Erin Tripp, Yifan Sun, Shaofeng Zou, Yi Zhou
| Summary:
Recent studies have shown that many nonconvex machine learning problems meet
a so-called generalized-smooth condition that extends beyond traditional smooth
nonconvex optimization. However, existing algorithms for generalized-smooth
nonconvex optimization suffer from significant limitations in both their design
and their convergence analysis. In this work, we first study
deterministic generalized-smooth nonconvex optimization and analyze the
convergence of normalized gradient descent under the generalized
Polyak-Łojasiewicz condition. Our results provide a comprehensive understanding
of the interplay between gradient normalization and function geometry. Then,
for stochastic generalized-smooth nonconvex optimization, we propose an
independently-normalized stochastic gradient descent algorithm, which leverages
independent sampling, gradient normalization and clipping to achieve an
$\mathcal{O}(\epsilon^{-4})$ sample complexity under relaxed assumptions.
Experiments demonstrate the fast convergence of our algorithm.
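For context, a commonly used $(L_0, L_1)$ generalized-smoothness condition, a generalized Polyak-Łojasiewicz condition, and the normalized gradient descent update can be written in the following illustrative forms; the exponent $\alpha$, the constants $L_0, L_1, \mu$, and the step size $\gamma$ are stated here as assumptions and need not match the paper's exact definitions:

$$\|\nabla f(x) - \nabla f(y)\| \le \big(L_0 + L_1 \|\nabla f(x)\|\big)\,\|x - y\|,$$
$$\|\nabla f(x)\|^{\alpha} \ge 2\mu\,\big(f(x) - f^{*}\big),$$
$$x_{t+1} = x_t - \gamma\,\frac{\nabla f(x_t)}{\|\nabla f(x_t)\|}.$$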
| Search Query: ArXiv Query: search_query=au:"Yi Zhou"&id_list=&start=0&max_results=3
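Below is a minimal sketch of one independently-normalized stochastic gradient step, assuming the recipe described in the summary (two independently drawn mini-batches, one providing the update direction and the other the normalization/clipping scale). The function names, the threshold `c`, and the exact clipping rule are illustrative assumptions rather than the authors' algorithm.

    import numpy as np

    def insgd_step(x, grad_fn, sample_batch, lr=0.01, c=1.0):
        # Draw two *independent* mini-batches so the update direction and its
        # normalization use independent stochastic gradients (assumption:
        # sample_batch() returns a fresh, independently sampled mini-batch).
        batch_a, batch_b = sample_batch(), sample_batch()
        g_dir = grad_fn(x, batch_a)                # gradient used for the direction
        g_scale = grad_fn(x, batch_b)              # gradient used only for the scale
        scale = max(np.linalg.norm(g_scale), c)    # normalization with clipping
        return x - lr * g_dir / scale              # normalized gradient step

Decoupling the update direction from its normalization in this way is one natural reading of "independent sampling" in the summary; the precise construction is given in the paper itself.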