[Invited Talk] Partial Monotonic Functions and Deep Lattice Networks in TensorFlow

Presenter: Seungil You, Google; Time: 3:00pm, Friday (2017/11/10); Location: Building 133 Room 204


Real-world machine learning applications may have requirements beyond accuracy, such as fast evaluation times and interpretability. In particular, guaranteed monotonicity of the learned function with respect to some of the inputs can be critical for user confidence. We propose parametrized deep models that are monotonic with respect to a user-specified set of inputs, built by alternating layers of linear embeddings, ensembles of lattices, and calibrators (piecewise linear functions), with appropriate constraints for monotonicity, and by jointly training the resulting network. We will cover the basics of calibrator and lattice models. A calibrator is an interpretable per-feature transformation that normalizes continuous features and handles categorical or missing data. A lattice model is an interpolated lookup table, which can capture nonlinear feature interactions and makes it easy to enforce partial monotonicity via a set of linear inequalities. To train this deep model, we implement the layers and the projections for partial monotonicity as new computational graph nodes in TensorFlow, and we will discuss how to use this model in TensorFlow, from training with the Adam optimizer and batched stochastic gradients to the TensorFlow Estimators implementation. Experiments on benchmark and real-world datasets show that six-layer monotonic deep lattice networks achieve state-of-the-art performance for classification and regression with monotonicity guarantees.
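To make the two building blocks concrete, here is a minimal NumPy sketch (not the talk's TensorFlow implementation) of a piecewise-linear calibrator and a 2x2 lattice with multilinear interpolation, plus a simple projection that enforces monotonicity in one input; all function names and the projection scheme are illustrative assumptions, not the library's API.

```python
import numpy as np

def calibrate(kp_in, kp_out, x):
    """Piecewise-linear calibrator: interpolate x against keypoints.
    The calibrator is monotonic whenever kp_out is nondecreasing."""
    return np.interp(x, kp_in, kp_out)

# A 2x2 lattice is a lookup table over the unit square; theta[i, j]
# holds the output at corner (i, j), and interior points are evaluated
# by multilinear interpolation of the four corner parameters.
theta = np.array([[0.0, 0.5],
                  [0.2, 1.0]])

def lattice_eval(theta, x):
    """Multilinear interpolation of a 2x2 lattice at x in [0, 1]^2."""
    x0, x1 = x
    return ((1 - x0) * (1 - x1) * theta[0, 0]
            + (1 - x0) * x1       * theta[0, 1]
            + x0       * (1 - x1) * theta[1, 0]
            + x0       * x1       * theta[1, 1])

# Monotonicity in feature 0 is the set of linear inequalities
# theta[1, j] >= theta[0, j].  A Euclidean projection applied after
# each gradient step replaces any violating pair by its average.
def project_monotonic(theta):
    theta = theta.copy()
    for j in range(theta.shape[1]):
        if theta[1, j] < theta[0, j]:
            mid = 0.5 * (theta[0, j] + theta[1, j])
            theta[0, j] = theta[1, j] = mid
    return theta
```

Deeper models interleave these pieces (calibrate, embed linearly, interpolate ensembles of lattices) and apply the projection during training so the end-to-end function stays monotonic in the chosen inputs.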


Seungil You works on a machine learning team in Google Research. His main research interests include mathematical optimization and its applications to machine learning, robust optimization, power systems, and communication systems. He received his PhD in Control and Dynamical Systems from the California Institute of Technology and his B.S. in Electrical Engineering from Seoul National University. He has served as a reviewer for IEEE Transactions on Automatic Control, IEEE Transactions on Communications, the Conference on Decision and Control, the American Control Conference, the European Control Conference, Neural Information Processing Systems, SIGKDD, the International Conference on Machine Learning, and the International Conference on Learning Representations.