Joey Huchette
Joey Huchette is a Software Engineer working on the Large Scale Optimization team. He received his PhD in Operations Research from MIT, advised by Juan Pablo Vielma. His work has been recognized with the Beale–Orchard-Hays Prize, the INFORMS Computing Society Prize, and the COIN-OR Cup.
Authored Publications
Strong mixed-integer programming formulations for trained neural networks
Will Ma
Mathematical Programming, 183 (2020), pp. 3-39
Abstract
We present strong mixed-integer programming (MIP) formulations for high-dimensional piecewise linear functions that correspond to trained neural networks. These formulations can be used for a number of important tasks, such as verifying that an image classification network is robust to adversarial inputs, or solving decision problems where the objective function is a machine learning model. We present a generic framework, which may be of independent interest, that provides a way to construct sharp or ideal formulations for the maximum of d affine functions over arbitrary polyhedral input domains. We apply this result to derive MIP formulations for a number of the most popular nonlinear operations (e.g. ReLU and max pooling) that are strictly stronger than other approaches from the literature. We corroborate this computationally, showing that our formulations are able to offer substantial improvements in solve time on verification tasks for image classification networks.
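For context, the sketch below shows how a single trained ReLU unit is typically encoded as a MIP via the standard big-M method, which is the weaker baseline that formulations like those in this paper improve upon; it is not the paper's formulation. The weights, bias, bounds, and the use of gurobipy as the modeling layer are illustrative assumptions.

```python
# A minimal sketch, not the paper's formulation: the standard big-M MIP
# encoding of one trained ReLU unit y = max(0, w.x + b). Assumes gurobipy,
# hypothetical weights/bias, and valid pre-activation bounds L <= w.x + b <= U
# (e.g. obtained from interval arithmetic).
import gurobipy as gp
from gurobipy import GRB

w, b = [1.0, -2.0], 0.5        # hypothetical trained weights and bias
L, U = -3.5, 3.5               # assumed valid bounds on w.x + b

m = gp.Model("relu_bigM")
x = m.addVars(2, lb=-1.0, ub=1.0, name="x")   # network inputs over a box domain
y = m.addVar(lb=0.0, name="y")                # ReLU output
z = m.addVar(vtype=GRB.BINARY, name="z")      # 1 if the unit is active

pre = gp.quicksum(w[i] * x[i] for i in range(2)) + b
m.addConstr(y >= pre)                  # y >= w.x + b (y >= 0 via its lower bound)
m.addConstr(y <= pre - L * (1 - z))    # binds when z = 1 (active piece)
m.addConstr(y <= U * z)                # forces y = 0 when z = 0 (inactive piece)

m.setObjective(y, GRB.MAXIMIZE)        # e.g. maximize the activation over the input box
m.optimize()
```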
Strong mixed-integer programming formulations for trained neural networks
Integer Programming and Combinatorial Optimization - 20th International Conference, IPCO 2019, Ann Arbor, MI, USA, May 22-24, 2019, Proceedings, Springer, pp. 27-42
Abstract
We present an ideal mixed-integer programming (MIP) formulation for a rectified linear unit (ReLU) appearing in a trained neural network. Our formulation requires a single binary variable and no additional continuous variables beyond the input and output variables of the ReLU. We contrast it with an ideal “extended” formulation with a linear number of additional continuous variables, derived through standard techniques. An apparent drawback of our formulation is that it requires an exponential number of inequality constraints, but we provide a routine to separate these inequalities in linear time. We also prove that these exponentially many constraints are facet-defining under mild conditions. Finally, we study network verification problems and observe that dynamically separating the exponential family of inequalities (1) is much more computationally efficient and scalable than the extended formulation, (2) decreases the solve time of a state-of-the-art MIP solver by a factor of 7 on smaller instances, and (3) nearly matches the dual bounds of a state-of-the-art MIP solver on harder instances, after just a few rounds of separation and in orders of magnitude less time.
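For readers who want a concrete picture of the “extended” formulation the abstract contrasts against, one standard way to write it is the Balas-style disjunctive formulation of the two pieces of the ReLU graph over a box input domain. The block below is a sketch under that assumption, with notation chosen here; it is not necessarily the exact statement in the paper.

```latex
% A sketch (assumed notation, not the paper's exact statement): a standard
% Balas-style extended formulation of y = max(0, w^T x + b) over x in [L, U],
% using copies x^0, x^1 of the input and a single binary variable z.
\begin{align*}
  & x = x^0 + x^1, \qquad y = w^\top x^1 + b\,z, \\
  & w^\top x^0 + b\,(1 - z) \le 0, \qquad w^\top x^1 + b\,z \ge 0, \\
  & L\,(1 - z) \le x^0 \le U\,(1 - z), \qquad L\,z \le x^1 \le U\,z, \\
  & z \in \{0, 1\}.
\end{align*}
```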
Abstract
We give an explicit geometric way to build ideal mixed-integer programming (MIP) formulations for unions of polyhedra. The construction is described simply in terms of the normal directions of all hyperplanes spanned (in an r-dimensional linear space) by a set of directions determined by the extreme points shared among the alternatives. The resulting MIP formulation is ideal, and has exactly r integer variables and 2 × (number of spanning hyperplanes) general inequality constraints. We use this result to derive novel logarithmic-sized ideal MIP formulations for discontinuous piecewise linear functions and for the annulus constraint arising in robotics and power systems problems.
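As background for the claims about idealness and formulation size, the block below records the standard definitions used in this literature; the symbols are chosen here for illustration and are not taken from the paper.

```latex
% Background definitions (standard in the MIP-formulation literature; symbols
% chosen for illustration). A MIP formulation of a union of polyhedra
% D = P^1 \cup \dots \cup P^d in x-space, using auxiliary variables z, is a set
\begin{align*}
  Q &= \{ (x, z) \in \mathbb{R}^n \times \mathbb{R}^r : A x + B z \le c \}
      \quad \text{with} \quad
  D = \operatorname{proj}_x\!\left( Q \cap (\mathbb{R}^n \times \mathbb{Z}^r) \right).
\end{align*}
% The formulation is ideal if every extreme point of the LP relaxation Q already
% has integral z. The construction in this abstract produces such a Q with r
% integer variables and two general inequalities per spanning hyperplane.
```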