A superlinearly convergent first-order method for nonsmooth optimization

Vasileios Charisopoulos

Abstract

Nonsmooth optimization problems appear throughout machine learning and signal processing. However, standard first-order methods for nonsmooth optimization can be slow on "poorly conditioned" problems. In this talk, I will present a locally accelerated first-order method that is less sensitive to conditioning and achieves superlinear convergence near solutions (i.e., the error decays at a doubly exponential rate) for a broad family of problems. The algorithm is inspired by Newton's method for solving nonlinear equations.
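The abstract references Newton's method for nonlinear equations as the inspiration for the algorithm. As background, the minimal Python sketch below runs the classical Newton iteration on a smooth scalar equation to illustrate the doubly exponential error decay mentioned above; it is not the speaker's method, which works with first-order information on nonsmooth problems.

```python
import math

def newton(f, fprime, x0, iters=5):
    """Classical Newton iteration for f(x) = 0, printing the error per step."""
    x = x0
    for k in range(iters):
        x = x - f(x) / fprime(x)           # Newton step
        err = abs(x - math.sqrt(2.0))      # distance to the true root of x^2 = 2
        print(f"iter {k + 1}: error = {err:.3e}")
    return x

# Solve x^2 - 2 = 0 starting from x0 = 1.5. The number of correct digits
# roughly doubles each step, so the error decays like c^(2^k) -- the
# "double-exponential" rate the abstract refers to.
newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.5)
```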

Bio

Vasilis is an AI & Science postdoctoral scholar at the University of Chicago Data Science Institute. He is broadly interested in developing numerical optimization methods for machine learning, signal processing and scientific computing. He holds a PhD in Operations Research & Information Engineering from Cornell University, where he was advised by Damek Davis. Vasilis was recognized as a Rising Star in Computational and Data Sciences by the UT Austin Oden Institute in 2023 and received the Cornelia Ye Outstanding Teaching Assistant Award at Cornell University in 2021.

Vasileios Charisopoulos
University of Chicago
ECE 125
27 Feb 2024, 10:30am–11:30am