Tuesday, March 6, 2018 at 4:15pm
Mixed-integer programs (MIPs) can be very useful in decision-making by allowing us to represent discrete choices and structures. These can already be difficult to solve to optimality for large-scale problems even when the objective and feasible region are convex. For many applications, we want to be able to go beyond this and model concepts that naturally lead to extremely large or nonconvex formulations. These include robustness to uncertainty, economic ideas such as diminishing returns, and physical concepts such as AC power flow.
I will present two computationally efficient approaches—one drawn from discrete optimization and one from continuous optimization—for handling two different families of MIPs. First, I will present a new technique for deriving cutting planes for MIPs with separable concave costs. This can be combined with existing cuts for canonical mixed-integer linear sets to quickly solve planning problems with economies of scale. For stochastic MIPs, I will describe a new subgradient method for solving the dual decomposition that parallelizes significantly better than the traditional subgradient method on distributed and multi-core computer architectures. I will conclude by discussing some future directions in machine learning and (stochastic) mixed-integer programming.
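To give a flavor of the second approach: in dual decomposition for a stochastic MIP, each scenario subproblem is solved independently (these solves are the part that can run in parallel), and a subgradient step on the Lagrange multipliers enforces the nonanticipativity constraint that the scenarios agree on the shared decision. The following is a minimal toy sketch of the classical serial subgradient loop, not the speaker's method; the two-scenario data, the brute-force subproblem solver (standing in for a real MIP solver), and all function names are illustrative assumptions.

```python
import itertools
import numpy as np

def solve_subproblem(cost, n):
    # Toy scenario subproblem: minimize cost @ x over binary x with at
    # least one item selected, solved by brute-force enumeration.
    # In practice each of these would be an independent MIP solve,
    # which is what makes the method easy to parallelize.
    best_val, best_x = float("inf"), None
    for bits in itertools.product([0, 1], repeat=n):
        if sum(bits) == 0:
            continue  # enforce the toy feasibility constraint sum(x) >= 1
        x = np.array(bits)
        val = cost @ x
        if val < best_val:
            best_val, best_x = val, x
    return best_val, best_x

def dual_decomposition(c1, c2, p1=0.5, p2=0.5, iters=200, step0=1.0):
    # Two scenarios with costs c1, c2 and probabilities p1, p2.
    # Nonanticipativity x1 = x2 is relaxed with multipliers lam;
    # the dual function splits into two independent subproblems:
    #   D(lam) = min (p1*c1 + lam) @ x1  +  min (p2*c2 - lam) @ x2
    n = len(c1)
    lam = np.zeros(n)
    best_dual = -float("inf")
    for k in range(1, iters + 1):
        v1, x1 = solve_subproblem(p1 * np.array(c1) + lam, n)
        v2, x2 = solve_subproblem(p2 * np.array(c2) - lam, n)
        best_dual = max(best_dual, v1 + v2)  # best lower bound so far
        g = x1 - x2                  # subgradient of D at lam
        lam = lam + (step0 / k) * g  # diminishing-step subgradient ascent
    return best_dual

# The returned value is a lower bound on the optimal primal cost of the
# coupled problem min (p1*c1 + p2*c2) @ x over the shared feasible set.
```

The dual value never exceeds the primal optimum, so the loop produces progressively better lower bounds; the talk's contribution concerns replacing this serial multiplier update with one that scales better across many scenario subproblems.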
Cong Han Lim is a postdoctoral researcher in the Wisconsin Institute for Discovery at the University of Wisconsin-Madison. Prior to that, he received his B.Sc. in mathematics and computer science from the University of Chicago and his M.S./Ph.D. in computer sciences from the University of Wisconsin-Madison. He spent the fall 2017 semester as a research fellow at the Simons Institute for the Theory of Computing. Cong Han studies discrete optimization problems arising in machine learning and operations research. His work aims to expand the classes of problems that can be solved in practice by leveraging techniques from both continuous and discrete optimization.