This article expands on episode 4.1, explaining Gradient Descent and how it is used to minimise our cost function in Linear Regression. Knowledge of derivatives and partial derivatives will be helpful.

Linear Regression Recap

From the previous episode, we calculated the regression line for our humidity and temperature data.

We obtained its parameters from the cost function graph shown below.
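As a refresher, linear regression typically uses the mean squared error as its cost function. Written in the usual notation (assuming θ₀ and θ₁ denote the line's intercept and slope, and m the number of data points):

$$h(x) = \theta_0 + \theta_1 x$$

$$J(\theta_0, \theta_1) = \frac{1}{2m} \sum_{i=1}^{m} \left( h(x_i) - y_i \right)^2$$

The factor of 1/2 is a common convention that cancels neatly when we differentiate.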

The algorithm we use to find the parameter values that give this minimum cost is called gradient descent.

Overview

The idea of gradient descent is that we start at a random point on our cost function graph, for example here:

And use partial derivatives to make our way down to the minimum.
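Concretely, each step of gradient descent subtracts the partial derivative of the cost (scaled by a learning rate α) from each parameter. For the mean-squared-error cost above, a standard form of the updates is:

$$\theta_0 := \theta_0 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h(x_i) - y_i \right)$$

$$\theta_1 := \theta_1 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h(x_i) - y_i \right) x_i$$

Both parameters are updated simultaneously, using the values from the previous step. The learning rate α controls how large a step we take downhill on each iteration.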

We then look at which parameter values produce this minimum cost and use them in our regression line.
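Below is a minimal sketch of this procedure in Python. The `gradient_descent` function and its defaults are illustrative choices, and the humidity/temperature readings are made-up stand-ins for the series' actual data:

```python
# Minimal batch gradient descent for simple linear regression
# (a sketch, not the article's exact implementation).

def gradient_descent(x, y, alpha=0.1, iterations=5000):
    """Fit y ≈ theta0 + theta1 * x by minimising the MSE cost."""
    m = len(x)
    theta0, theta1 = 0.0, 0.0  # arbitrary starting point on the cost surface

    for _ in range(iterations):
        # Prediction error of the current line at every data point
        errors = [(theta0 + theta1 * xi) - yi for xi, yi in zip(x, y)]

        # Partial derivatives of the cost with respect to each parameter
        grad0 = sum(errors) / m
        grad1 = sum(e * xi for e, xi in zip(errors, x)) / m

        # Simultaneous update: step both parameters downhill at once
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1

    return theta0, theta1

# Made-up humidity (x) and temperature (y) readings, for illustration only
x = [0.3, 0.5, 0.7, 0.9]
y = [22.0, 20.5, 18.9, 17.2]

theta0, theta1 = gradient_descent(x, y)
print(f"regression line: y = {theta0:.2f} + {theta1:.2f}x")
```

With enough iterations and a suitably small learning rate, the parameters settle at the minimum of the cost surface, which is exactly the regression line from the recap.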

