Shape Constraints for Set Functions

Andrew Cotter
Maya R. Gupta
Heinrich Jiang
Erez Louidor
Jim Muller
Taman Narayan
Tao Zhu
International Conference on Machine Learning (2019) (to appear)
Abstract

Set functions predict a label from a permutation-invariant, variable-size collection of feature vectors. We propose making set functions more understandable and better regularized by capturing domain knowledge through shape constraints. We show how prior work on monotonicity constraints can be adapted to set functions. We then propose two new shape constraints designed to generalize the conditioning role of weights in a weighted mean. We show how one can use a deep lattice network to train standard functions and set functions that satisfy these shape constraints. We also propose a non-linear estimation strategy we call the semantic feature engine, which uses set functions with the proposed shape constraints to estimate labels for compound sparse categorical features. Experiments on real-world data show accuracy similar to deep sets or deep neural networks, while the shape constraints provide guarantees on model behavior that make the models easier to explain and debug.
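As a rough illustration of the setting (not the paper's architecture), a permutation-invariant set function can be built by pooling per-item contributions, for example with a weighted mean in which the per-item weights play the conditioning role mentioned in the abstract. The sketch below is a minimal toy example; the function and parameter names are hypothetical.

```python
import numpy as np


def weighted_mean_set_fn(feature_vectors, score_w, score_b, value_w, value_b):
    """Toy permutation-invariant set function.

    Each item contributes a value and a positive weight; the prediction is
    the weighted mean of the values, so reordering the input rows does not
    change the output.
    """
    x = np.asarray(feature_vectors, dtype=float)  # shape: (num_items, num_features)
    weights = np.exp(x @ score_w + score_b)       # positive per-item weights
    values = x @ value_w + value_b                # per-item values
    return float(np.sum(weights * values) / np.sum(weights))


rng = np.random.default_rng(0)
items = rng.normal(size=(5, 3))                   # a set of 5 feature vectors
params = dict(score_w=rng.normal(size=3), score_b=0.0,
              value_w=rng.normal(size=3), value_b=0.0)

y1 = weighted_mean_set_fn(items, **params)
y2 = weighted_mean_set_fn(items[::-1], **params)  # same set, different order
assert np.isclose(y1, y2)                         # permutation invariance
```

Shape constraints such as monotonicity would then restrict how the learned per-item value and weight functions respond to individual features, which is what makes the resulting model easier to explain and debug.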