In this article, we’ll look at overfitting and some of the ways to avoid overfitting your model. A machine learning model has one sole aim – to generalize well.

The efficiency of both the model and the program as a whole depends strongly on the model’s generalization: the model serves its purpose only if it generalizes well. Building on that idea, terms such as overfitting and underfitting describe flaws that can undermine the model’s success.

Overfitting – Defining and Visualizing

After training for a certain number of epochs, the accuracy of our model on the validation data peaks and then either stagnates or begins to decrease.
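A common response to this behaviour is early stopping: keep training only while validation accuracy keeps improving. Here is a minimal sketch in plain Python; the `train_step` and `evaluate` callables are illustrative placeholders, not part of the original article:

```python
def train_with_early_stopping(train_step, evaluate, max_epochs=100, patience=5):
    """Stop once validation accuracy has not improved for `patience` epochs.

    `train_step` runs one epoch of training; `evaluate` returns the current
    validation accuracy. Both are assumed to be supplied by the caller.
    """
    best_acc, best_epoch = 0.0, 0
    for epoch in range(1, max_epochs + 1):
        train_step()
        acc = evaluate()
        if acc > best_acc:
            best_acc, best_epoch = acc, epoch          # new peak: remember it
        elif epoch - best_epoch >= patience:
            break                                      # no improvement for a while
    return best_epoch, best_acc
```

In practice, one would also checkpoint the model weights at the best epoch and restore them after stopping; most frameworks (e.g. Keras’s `EarlyStopping` callback) bundle this logic for you.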

Instead of learning generalized patterns from the training data, the model tries to fit the data itself. As a result, it memorizes fluctuations specific to the training set, including noise and outliers.

Hence for **regression**, instead of a smooth curve through the center of the data that minimizes the error, like this:

[Figure: a smooth regression curve through the data – not overfitting]
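The contrast between a smooth fit and an overfit curve can be sketched numerically. The example below (an illustrative assumption – the data, split, and polynomial degrees are not from the article) fits a low-degree and a high-degree polynomial with NumPy and compares errors on training versus held-out points:

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples from a simple linear trend: y = 2x + noise.
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x + rng.normal(scale=0.3, size=x.size)

# Alternate points between training and a held-out validation set.
x_train, y_train = x[::2], y[::2]
x_val, y_val = x[1::2], y[1::2]

def fit_errors(degree):
    """Mean squared error on the training and validation points."""
    p = np.poly1d(np.polyfit(x_train, y_train, degree))
    return (np.mean((p(x_train) - y_train) ** 2),
            np.mean((p(x_val) - y_val) ** 2))

train_lo, val_lo = fit_errors(1)   # smooth line through the data
train_hi, val_hi = fit_errors(9)   # degree 9 interpolates all 10 points

# The degree-9 fit drives the training error toward zero, yet it
# typically does worse than the line on the held-out points.
```

The high-degree polynomial achieves near-zero training error precisely because it chases every noisy point – the defining symptom of overfitting.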


Overfitting - What is it and How to Avoid Overfitting a model?