
Speaker "Seungil You" Details


Topic

The Power Of Monotonicity: TensorFlow Lattice

Abstract

After a brief introduction to core TensorFlow concepts, we'll focus on the TensorFlow Lattice package, which lets you train more interpretable machine-learned models without sacrificing accuracy. TF Lattice enables you to encode your prior knowledge about global trends into your models, such as that closer coffee shops are better (all other features being equal). Such global trends can be missed by flexible models like RFs and DNNs when they are trained on noisy data, and the resulting errors may surface only when you run your model on examples that differ from your training data (data shift). By learning your preferred global trends, TF Lattice produces models that generalize better and that you can explain and debug more easily, because you know what the model is doing. We'll show you how to use TF Lattice's pre-built TF Estimators, and how to use the underlying TF operators to build your own deep lattice network models or plug-n-play with other TF models, including DNNs. Suitable for TF newbies and advanced TF users.
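
For a concrete picture of what encoding such a global trend can look like, here is a minimal sketch using the TensorFlow Lattice Keras layers (tfl.layers.PWLCalibration and tfl.layers.Lattice) rather than the canned TF Estimators covered in the talk. The feature names, keypoints, and lattice sizes are hypothetical; it is just one way to express the "closer coffee shops are better" prior as a monotonicity constraint.

```python
# A minimal sketch (not taken from the talk) of encoding the "closer is
# better" prior as a monotonicity constraint with TensorFlow Lattice
# Keras layers. Feature names, keypoints, and sizes are hypothetical.
import numpy as np
import tensorflow as tf
import tensorflow_lattice as tfl

# Piecewise-linear calibrators map each raw feature into [0, 1] before the
# lattice; the distance calibrator is constrained to be decreasing (closer
# coffee shops score higher) and the rating calibrator to be increasing.
calibrators = tfl.layers.ParallelCombination()
calibrators.append(tfl.layers.PWLCalibration(
    input_keypoints=np.linspace(0.0, 10.0, num=10),  # distance in km
    output_min=0.0, output_max=1.0,
    monotonicity='decreasing'))
calibrators.append(tfl.layers.PWLCalibration(
    input_keypoints=np.linspace(1.0, 5.0, num=5),    # average star rating
    output_min=0.0, output_max=1.0,
    monotonicity='increasing'))

# A 2x2 lattice interpolates over the calibrated inputs; monotonicity in
# each lattice dimension ensures the specified global trends are respected.
lattice = tfl.layers.Lattice(
    lattice_sizes=[2, 2],
    monotonicities=['increasing', 'increasing'],
    output_min=0.0, output_max=1.0)

model = tf.keras.models.Sequential([calibrators, lattice])
model.compile(loss='mse', optimizer=tf.keras.optimizers.Adam(0.01))
# model.fit(features, labels, ...) then trains a model that is monotonic
# in both features by construction, no matter how noisy the data is.
```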

Profile

Seungil You currently works at Google Research, where he improves and maintains the open-source package TensorFlow Lattice. He received his PhD in Control and Dynamical Systems from Caltech and his B.S. in Electrical Engineering from Seoul National University. His main research interests include mathematical optimization and its applications to machine learning and power systems.