Hyperparameters are the configuration values, set before training rather than learned from data, that determine a model's architecture, how quickly and how long it learns, and how strongly it is regularized. Typical examples are the number of layers, the learning rate, and the weight-decay coefficient.

The search for good hyperparameters requires some expertise and patience, and you'll often find people falling back on exhaustive, brute-force methods like grid search and random search to find the values that work best for their problem.

A quick tutorial

I'm going to show you how to use Bayesian optimization, through the Ax library, to automatically find a strong hyperparameter set for your neural network in PyTorch.
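
To give you a feel for the workflow before we start, here is a minimal sketch of Ax's managed optimization loop. The parameter names, ranges, and the toy evaluation function are placeholder assumptions for illustration; in the tutorial proper, the evaluation function trains and scores the real network.

```python
# Minimal sketch of Ax's managed optimization loop (assumes `pip install ax-platform`).
# The parameter ranges and the toy objective below are illustrative placeholders.
from ax.service.managed_loop import optimize

def train_and_evaluate(parameterization):
    # In the real tutorial, this would train the network with the given
    # hyperparameters and return its validation accuracy. Here we use a
    # toy objective peaked near lr = 0.01 so the sketch runs on its own.
    lr = parameterization["lr"]
    accuracy = 1.0 / (1.0 + 100 * abs(lr - 0.01))
    return {"accuracy": (accuracy, 0.0)}  # metric name -> (mean, SEM)

best_parameters, values, experiment, model = optimize(
    parameters=[
        {"name": "lr", "type": "range", "bounds": [1e-6, 0.4], "log_scale": True},
        {"name": "momentum", "type": "range", "bounds": [0.0, 1.0]},
    ],
    evaluation_function=train_and_evaluate,
    objective_name="accuracy",
    total_trials=20,
)
print(best_parameters)
```

Ax alternates between fitting a probabilistic model to the trials it has seen and proposing the next hyperparameter configuration to try, so you only write the evaluation function and the parameter space.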

We’ll be building a simple CIFAR-10 classifier using transfer learning. Most of this code is from the official PyTorch beginner tutorial for a CIFAR-10 classifier.
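
As a preview of that setup, here is a minimal sketch, assuming torchvision is available. The choice of ResNet-18, the transforms, and the batch size are illustrative assumptions, not necessarily what the final tutorial code uses.

```python
# Minimal sketch of the transfer-learning setup: a pretrained ResNet-18
# backbone with a new 10-class head for CIFAR-10.
import torch
import torch.nn as nn
import torchvision
import torchvision.transforms as transforms

transform = transforms.Compose([
    transforms.Resize(224),  # ResNet-18 was pretrained on 224x224 ImageNet images
    transforms.ToTensor(),
    transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),
])

trainset = torchvision.datasets.CIFAR10(root="./data", train=True,
                                        download=True, transform=transform)
trainloader = torch.utils.data.DataLoader(trainset, batch_size=4, shuffle=True)

net = torchvision.models.resnet18(pretrained=True)
net.fc = nn.Linear(net.fc.in_features, 10)  # replace the head for 10 CIFAR classes
```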

I won't be going into the details of Bayesian optimization here. In short, it builds a probabilistic model of the objective (in our case, validation accuracy as a function of the hyperparameters) and uses that model to pick the most promising configurations to try next, typically needing far fewer trials than grid or random search. For the background, you can study the algorithm on the Ax website, read the original paper, or read the 2012 paper on its practical use.
