GTC Silicon Valley 2019, ID S9372: Automatic Model Tuning in Practice Using Bayesian Hyperparameter Tuning

Cyrus Vahid (Amazon Web Services)
Tuning hyperparameters is a time-consuming and costly task. More an art than a science, it often takes long hours to arrive at a good combination of parameters such as batch size, learning rate, optimizer, number of layers, number of nodes per layer, and potentially tens of others. We'll discuss how automating the search for the best combination of parameters, based on a data-centric and repeatable method, can save time and result in better models. We'll explain the theory behind Bayesian hyperparameter optimization and provide hands-on labs to help attendees learn how to take advantage of Amazon SageMaker's Automatic Model Tuning.
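
As a concrete illustration of the kind of workflow the session covers, the sketch below shows roughly how a Bayesian tuning job can be set up with the SageMaker Python SDK. The estimator, training script, IAM role, hyperparameter names, ranges, S3 path, and metric regex are illustrative assumptions, not the exact configuration used in the session's labs.

```python
# Minimal sketch of SageMaker Automatic Model Tuning with the Bayesian strategy.
# Estimator settings, hyperparameter names/ranges, the metric regex, and the S3
# path are placeholders chosen for illustration.
import sagemaker
from sagemaker.mxnet import MXNet
from sagemaker.tuner import (
    HyperparameterTuner,
    ContinuousParameter,
    IntegerParameter,
    CategoricalParameter,
)

role = sagemaker.get_execution_role()  # assumes a SageMaker notebook/Studio environment

# Training job definition; entry_point, instance type, and versions are assumptions.
estimator = MXNet(
    entry_point="train.py",
    role=role,
    instance_count=1,
    instance_type="ml.p3.2xlarge",
    framework_version="1.4.1",
    py_version="py3",
)

# Search space over a few of the hyperparameters mentioned in the abstract.
hyperparameter_ranges = {
    "learning_rate": ContinuousParameter(1e-4, 1e-1, scaling_type="Logarithmic"),
    "batch_size": IntegerParameter(32, 512),
    "optimizer": CategoricalParameter(["sgd", "adam"]),
}

tuner = HyperparameterTuner(
    estimator=estimator,
    objective_metric_name="validation-accuracy",
    objective_type="Maximize",
    hyperparameter_ranges=hyperparameter_ranges,
    metric_definitions=[
        {"Name": "validation-accuracy", "Regex": "validation accuracy=([0-9\\.]+)"}
    ],
    strategy="Bayesian",   # Bayesian optimization of the objective metric
    max_jobs=20,           # total training jobs to launch
    max_parallel_jobs=2,   # jobs evaluated concurrently
)

# Launches the tuning job; replace the S3 URI with real training data.
tuner.fit({"training": "s3://my-bucket/path/to/training-data"})
```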

View the slides (pdf)