
Speaker "Scott Clark" Details

Topic

Using Bayesian Optimization to Tune Machine Learning Models

Abstract

We introduce Bayesian Global Optimization, the core technology behind SigOpt, as an efficient way to optimize machine learning model parameters, especially when evaluating different parameter settings is time-consuming or expensive. The techniques described can be applied to a wide range of machine learning optimization problems, such as tuning a deep learning model, a classification algorithm, or any other machine learning system. In this talk we will build motivation, explain the underlying techniques, and give examples and comparisons to other standard methods.
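
To make the idea concrete, the sketch below shows the general Bayesian optimization loop the abstract alludes to: a Gaussian process surrogate models the expensive objective, and an Expected Improvement acquisition function chooses the next point to evaluate. This is an illustrative example built on scikit-learn and SciPy with a toy one-dimensional objective standing in for an expensive model evaluation; it is not SigOpt's product or the MOE codebase, and the function and parameter names are chosen here for illustration.

# Minimal Bayesian optimization sketch (not SigOpt's or MOE's implementation):
# a Gaussian process surrogate plus an Expected Improvement acquisition function.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Stand-in for an expensive evaluation, e.g. cross-validated accuracy
    # as a function of a single hyperparameter x (here a toy 1-D function).
    return -np.sin(3 * x) - x ** 2 + 0.7 * x

def expected_improvement(candidates, gp, best_y, xi=0.01):
    # Expected Improvement for maximization over the surrogate's predictions.
    mu, sigma = gp.predict(candidates, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    imp = mu - best_y - xi
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)

bounds = (-2.0, 2.0)
rng = np.random.default_rng(0)

# Seed the surrogate with a few random evaluations of the expensive objective.
X = rng.uniform(*bounds, size=(3, 1))
y = np.array([objective(x[0]) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(15):
    gp.fit(X, y)
    # Pick the next point by maximizing Expected Improvement over a dense grid.
    candidates = np.linspace(*bounds, 1000).reshape(-1, 1)
    ei = expected_improvement(candidates, gp, y.max())
    x_next = candidates[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next[0]))

print(f"best x = {X[np.argmax(y)][0]:.3f}, best value = {y.max():.3f}")

In practice the candidate grid would be replaced by a proper optimization of the acquisition function over a multi-dimensional parameter space, and the toy objective by the real training-and-validation run being tuned; the point of the loop is that each expensive evaluation is chosen deliberately rather than by grid or random search.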

Profile

Scott has been applying optimal learning techniques in industry and academia for years, from bioinformatics to production advertising systems. Before SigOpt, Scott worked on the Ad Targeting team at Yelp, leading the charge on academic research and outreach with projects like the Yelp Dataset Challenge and the open sourcing of MOE. Scott holds a PhD in Applied Mathematics and an MS in Computer Science from Cornell University, and BS degrees in Mathematics, Physics, and Computational Physics from Oregon State University. Scott was chosen as one of Forbes' 30 Under 30 in 2016.