Speaker: Prof. Quoc Tran-Dinh
Affiliation: University of North Carolina at Chapel Hill
Abstract: In this talk, we demonstrate one way of exploiting smooth structures hidden in convex functions to develop optimization algorithms. Our key idea is to generalize the powerful concept of “self-concordance,” introduced by Y. Nesterov and A. Nemirovskii, to a broader class of convex functions. We show that this structure covers many applications in statistics and machine learning. We then develop a unified theory for designing numerical methods, especially proximal Newton-type methods. We illustrate our theory through two classes of proximal-based methods: standard proximal Newton schemes and homotopy proximal variable-metric algorithms. Using a homotopy strategy, we show that the proposed homotopy methods achieve global linear convergence rates for at least three subclasses of composite convex problems. We provide numerical examples from different fields to illustrate our theoretical development. We emphasize that the proposed theory can further be applied to develop other methods, as long as the underlying model involves a “generalized self-concordant structure.”
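As a small illustration of the generalized self-concordant structure mentioned in the abstract (this example is not from the talk materials): the univariate logistic loss φ(t) = log(1 + e^t), ubiquitous in statistics and machine learning, satisfies |φ'''(t)| ≤ φ''(t) for all t, a generalized self-concordance condition with constant M = 1 (whereas it is not standard self-concordant, which would require |φ'''| ≤ 2(φ'')^{3/2}). A minimal numerical check of this inequality:

```python
import math

def sigma(t):
    # Numerically stable logistic sigmoid s(t) = 1 / (1 + e^{-t})
    if t >= 0:
        return 1.0 / (1.0 + math.exp(-t))
    e = math.exp(t)
    return e / (1.0 + e)

def phi2(t):
    # Second derivative of log(1 + e^t): s(t) * (1 - s(t))
    s = sigma(t)
    return s * (1.0 - s)

def phi3(t):
    # Third derivative: s(t) * (1 - s(t)) * (1 - 2 s(t))
    s = sigma(t)
    return s * (1.0 - s) * (1.0 - 2.0 * s)

# Verify |phi'''(t)| <= phi''(t) on a grid in [-50, 50]
grid = [x / 10.0 for x in range(-500, 501)]
ok = all(abs(phi3(t)) <= phi2(t) + 1e-15 for t in grid)
print(ok)  # True
```

The inequality holds because φ'''(t) = φ''(t)(1 − 2σ(t)) and |1 − 2σ(t)| ≤ 1; it is this kind of bound that lets Newton-type methods be analyzed without global Lipschitz-gradient assumptions.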
Biography: Quoc Tran-Dinh is currently an assistant professor at the Department of Statistics and Operations Research, University of North Carolina at Chapel Hill. He obtained his Bachelor's degree at Vietnam National University, Hanoi, and his Ph.D. at the Department of Electrical Engineering and the Optimization in Engineering Center at KU Leuven, Belgium. He was a postdoctoral researcher for three years at the École Polytechnique Fédérale de Lausanne (EPFL), Switzerland, before joining UNC-Chapel Hill. His research mainly focuses on numerical methods for continuous optimization, including convex and nonconvex optimization, and their applications.
Date(s) - Oct 03, 2019
2:00 pm - 3:30 pm
E-IV Faraday Room #67-124
420 Westwood Plaza - 6th Flr., Los Angeles CA 90095