In this video, we will talk about Gradient Descent and how we can use it to update the weights and bias of our AI model. We will learn how to minimize the average loss of our model, and get a warm introduction to "epochs" and "learning rate"!
We will of course also see a working example of the math behind Gradient Descent, and learn how to implement it with code by using our superior Python skills! 🐍🐍🐍
Before we dive in, make sure you're comfortable with the previous topics in my AI series: Perceptron, Weights, Input, Weighted Sum, Target, Prediction, Activation Function, Loss Function & Cross-Entropy Loss.
*****************************************
⭐ time stamps ⭐
*****************************************
00:00 - what is gradient descent?
00:37 - gradient descent vs perceptron
01:04 - sigmoid activation function
01:45 - bias and threshold
02:06 - weighted sum - working example
02:37 - sigmoid - working example
03:03 - loss function - working example
03:32 - how to update weights
04:17 - what is learning rate?
05:06 - how to update bias
05:37 - gradient descent - working example
07:13 - what is an epoch?
07:38 - average loss per epoch
08:37 - gradient descent code example
12:13 - thank you for watching! stay in touch!
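For a taste of what the code example in the video covers, here is a minimal sketch of gradient descent for a single neuron with a sigmoid activation and cross-entropy loss. All names and values below are illustrative, not the video's actual code:

```python
import math

def sigmoid(z):
    """Squash the weighted sum into a prediction between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-z))

def train(inputs, targets, weights, bias, learning_rate=0.5, epochs=1000):
    """Update weights and bias with gradient descent; return the final
    parameters and the average loss of the last epoch."""
    for _ in range(epochs):  # one epoch = one full pass over the data
        total_loss = 0.0
        for x, target in zip(inputs, targets):
            # weighted sum, then activation
            z = sum(w * xi for w, xi in zip(weights, x)) + bias
            prediction = sigmoid(z)
            # cross-entropy loss for this example
            total_loss -= (target * math.log(prediction)
                           + (1 - target) * math.log(1 - prediction))
            # for sigmoid + cross-entropy, the gradient of the loss with
            # respect to the weighted sum simplifies to (prediction - target)
            error = prediction - target
            # nudge each parameter against its gradient, scaled by the learning rate
            weights = [w - learning_rate * error * xi
                       for w, xi in zip(weights, x)]
            bias -= learning_rate * error
    return weights, bias, total_loss / len(inputs)

# example: learn the OR gate from two starting weights of 0
weights, bias, avg_loss = train(
    inputs=[[0, 0], [0, 1], [1, 0], [1, 1]],
    targets=[0, 1, 1, 1],
    weights=[0.0, 0.0],
    bias=0.0,
)
```

After training, the average loss per epoch should have dropped close to zero, which is exactly the quantity tracked in the video.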

#python #machinelearning #ai

Gradient Descent Explained!