Interplay between Generalization and Optimization via Algorithmic Stability
Mathematics of Data & Decisions

Speaker: Yiming Ying, SUNY Albany
Location: Zoom, https://ucdavis.zoom.us/j/97789573467
Start time: Tue, Nov 7 2023, 1:10PM
In this talk, I will delve into our analysis of stochastic gradient methods (SGMs), focusing on the interplay between generalization and optimization within the framework of statistical learning theory (SLT), and discuss their applications. The core concept in our study is algorithmic stability, a notion in SLT that characterizes how the output of an ML algorithm changes upon a small perturbation of the training data. Our theoretical studies significantly improved the existing results in the convex case and led to new insights into understanding the generalization of deep neural networks trained by SGD in the non-convex case. I will also discuss how to derive lower bounds for the convergence of existing AUC optimization algorithms, which in turn inspires a new direction for designing efficient algorithms. Additionally, I will touch on extensions to differential privacy and minimax problems.
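For readers unfamiliar with algorithmic stability, the sketch below (not part of the talk, and using a hypothetical toy linear-regression setup) illustrates the idea empirically: run the same SGD procedure on two training sets that differ in a single example and measure how far the two outputs drift apart.

```python
import numpy as np

def sgd_linear_regression(X, y, lr=0.01, epochs=50, seed=0):
    """Plain SGD on the squared loss for a linear model; returns the final weights."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):          # one pass over the data in random order
            grad = (X[i] @ w - y[i]) * X[i]   # gradient of 0.5 * (x_i^T w - y_i)^2
            w -= lr * grad
    return w

# Hypothetical synthetic dataset S, and a neighboring dataset S' differing in one example.
rng = np.random.default_rng(42)
n, d = 200, 5
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

X_prime, y_prime = X.copy(), y.copy()
X_prime[0] = rng.normal(size=d)   # replace a single training point
y_prime[0] = rng.normal()

w = sgd_linear_regression(X, y)
w_prime = sgd_linear_regression(X_prime, y_prime)

# An empirical proxy for stability: how much the algorithm's output changes
# under a one-point perturbation of the training data.
print("parameter gap ||w - w'||:", np.linalg.norm(w - w_prime))
```

A small gap across many such one-point perturbations is the empirical signature of a stable algorithm, which is the property stability-based generalization bounds exploit.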