Tuesday, November 14, 2017 at 4:15pm
Numerous tasks in applied mathematics and data science lead to minimizing a composition of a finite convex function with a smooth nonlinear map. I will discuss various aspects of this problem class, including the efficiency of first-order methods, stochastic algorithms for large finite sums, fast local rates of convergence, and termination criteria. In the second part of the talk, I will specialize the aforementioned techniques to the phase retrieval problem. I will explain how the composite framework allows one to determine high-probability regions devoid of stationary points; these are the regions where the landscape of the objective function is benign. Building on the recent work of Duchi and Ruan '17, I will then explain how one can harness the rapid convergence guarantees of proximal and subgradient-type methods to devise globally efficient algorithms for the phase retrieval problem.
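To make the composite structure concrete, the sketch below (not the speaker's code; all names and parameter choices are illustrative assumptions) casts phase retrieval in the composite form f(x) = h(c(x)), with c(x) = (⟨a_i, x⟩² − b_i) a smooth map and h the (finite, convex) mean absolute value, and runs a plain subgradient method of the kind the abstract mentions, using a simple Polyak-style step that assumes noiseless data (optimal value 0):

```python
import numpy as np

# Illustrative sketch: phase retrieval as a composite problem
#   min_x  f(x) = h(c(x)),  c(x) = (<a_i, x>^2 - b_i)_i,  h(z) = mean(|z_i|).
# Data, dimensions, and the warm-start initialization are assumptions
# made for this toy example, not part of the talk.

rng = np.random.default_rng(0)
m, n = 200, 10
A = rng.standard_normal((m, n))        # measurement vectors a_i (rows of A)
x_star = rng.standard_normal(n)        # ground-truth signal
b = (A @ x_star) ** 2                  # noiseless measurements b_i

def f(x):
    """Composite objective: mean absolute residual of the squared measurements."""
    return np.mean(np.abs((A @ x) ** 2 - b))

def subgrad(x):
    """A subgradient of f: chain rule through h(z) = mean(|z_i|) and c."""
    r = (A @ x) ** 2 - b
    return (2.0 / m) * (A.T @ (np.sign(r) * (A @ x)))

x = x_star + 0.1 * rng.standard_normal(n)   # start near the solution
for _ in range(500):
    g = subgrad(x)
    gn = g @ g
    if gn == 0.0:
        break
    # Polyak step, valid here because the optimal value is 0 (noiseless case)
    x -= (f(x) / gn) * g

# Phase retrieval recovers x only up to a global sign.
err = min(np.linalg.norm(x - x_star), np.linalg.norm(x + x_star))
```

From a good initialization, the iterates converge rapidly (up to the inherent sign ambiguity), illustrating the fast local behavior of subgradient-type methods on this problem class.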
Joint work with Damek Davis (Cornell), Alexander D. Ioffe (Technion), Adrian S. Lewis (Cornell), and Courtney Paquette (Lehigh).
Dmitriy Drusvyatskiy received his Ph.D. from the School of Operations Research and Information Engineering at Cornell University in 2013, followed by a postdoctoral appointment in the Combinatorics and Optimization department at the University of Waterloo, 2013-2014. He joined the Department of Mathematics at the University of Washington as an Assistant Professor in 2014. Dmitriy’s research broadly focuses on designing and analyzing algorithms for large-scale nonsmooth optimization problems, often motivated by applications in data science. Dmitriy has received a number of awards, including a finalist citation for the Tucker Prize (2015), an Air Force Office of Scientific Research (AFOSR) Young Investigator Program (YIP) Award, and an NSF CAREER Award. Dmitriy is currently a co-PI of a new NSF-funded Transdisciplinary Research in Principles of Data Science (TRIPODS) institute at the University of Washington.