Hyperparameters control a model's learning process: they determine the model's structure and how it is trained. Hyperparameter tuning is the process of finding the set of hyperparameter values that maximizes model performance and minimizes error.

Different models have different learning methods and therefore have different hyperparameters. For example, Random Forest Regression can be tuned by observing the model performance compared against the number of trees in the forest.
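As a minimal sketch of this idea (not from the original article, using a synthetic dataset), the number of trees can be compared against cross-validated performance:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Synthetic regression data stands in for a real dataset.
X, y = make_regression(n_samples=200, n_features=10, noise=0.5, random_state=0)

# Compare mean cross-validated R^2 as the forest grows.
for n_trees in [10, 50, 100]:
    model = RandomForestRegressor(n_estimators=n_trees, random_state=0)
    score = cross_val_score(model, X, y, cv=3, scoring="r2").mean()
    print(f"n_estimators={n_trees}: mean R^2 = {score:.3f}")
```

The value of `n_estimators` giving the highest score would be kept.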

A neural network's training is governed by hyperparameters such as the number of hidden layers, the dropout rate, and the choice of optimizer.
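An illustrative sketch using scikit-learn's MLPClassifier, which exposes the hidden-layer structure, optimizer, and regularization (dropout is not available in MLPClassifier; frameworks such as Keras or PyTorch would be used for that):

```python
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

# Toy classification data stands in for a real dataset.
X, y = make_classification(n_samples=200, random_state=0)

mlp = MLPClassifier(
    hidden_layer_sizes=(32, 16),  # two hidden layers
    solver="adam",                # optimizer type
    alpha=1e-4,                   # L2 regularization strength
    max_iter=500,
    random_state=0,
)
mlp.fit(X, y)
print(mlp.score(X, y))
```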

SVM algorithms require the kernel type (e.g., linear, polynomial, RBF) to be specified.
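A quick sketch (on a synthetic dataset) of how the kernel choice is passed to sklearn's SVC and how it changes the fitted model:

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, random_state=0)

# The same data, fit with two different kernels.
for kernel in ["linear", "rbf"]:
    clf = SVC(kernel=kernel).fit(X, y)
    print(f"{kernel}: training accuracy = {clf.score(X, y):.3f}")
```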

The XGBoost model can be tuned by controlling its many hyperparameters: the learning rate (eta), the evaluation metric, the regularization terms, etc.

Manually changing hyperparameters and refitting the model each time is tedious and error-prone. Grid Search automates this: it defines a search space as a grid of hyperparameter values and evaluates every position in the grid, i.e., every possible combination of values, to find the set that gives the best performance.
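The size of the grid is the product of the number of choices per hyperparameter. sklearn's ParameterGrid (the same expansion GridSearchCV performs internally) makes this concrete:

```python
from sklearn.model_selection import ParameterGrid

# 2 kernel choices x 3 values of C = 6 candidate models.
grid = {"kernel": ["linear", "rbf"], "C": [1, 10, 100]}
combos = list(ParameterGrid(grid))
print(len(combos))  # 6
```

This exhaustiveness is why grid search becomes expensive quickly as hyperparameters are added.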

Sklearn provides a grid search implementation through the GridSearchCV class.

- Import GridSearchCV

from sklearn.model_selection import GridSearchCV

- Import the classifier whose hyperparameters are to be tuned.

from sklearn.svm import SVC

- Set parameters grid space

parameters = [{'kernel': ['linear', 'poly', 'rbf', 'sigmoid'], 'C': [1, 2, 3, 300], 'max_iter': [500, 1000]}]

- Instantiate GridSearchCV and pass parameters

clf = GridSearchCV(SVC(), parameters, scoring='accuracy')

clf.fit(X_train, y_train)

- Print the best parameters

print(clf.best_params_)
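The steps above, combined into one runnable sketch (substituting sklearn's iris dataset for the article's unspecified X_train and y_train, and a smaller grid for speed):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

# Toy dataset in place of the unspecified training data.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

parameters = [{"kernel": ["linear", "rbf"], "C": [1, 10], "max_iter": [1000]}]

# GridSearchCV cross-validates every combination in the grid.
clf = GridSearchCV(SVC(), parameters, scoring="accuracy")
clf.fit(X_train, y_train)

print(clf.best_params_)   # best combination found
print(clf.best_score_)    # its mean cross-validated accuracy
```

The fitted `clf` can then be used directly for prediction; it refits the best combination on the full training set by default.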

Hyperparameter tuning is an extremely useful skill that, applied properly, yields the best possible model for the data. However, in-depth knowledge of the model being tuned, and of the implications of changing each hyperparameter, is crucial.