This lecture is dedicated to variations of the gradient descent algorithm. We cover Stochastic and Mini-batch Gradient Descent along with Simulated Annealing. The Python implementations are done in Jupyter notebooks (a minimal illustrative sketch follows the outline below).

⏲Outline⏲
00:00 Introduction
00:26 Simulated Annealing
02:42 Stochastic Gradient Descent with variable step size
10:12 Mini-batch Gradient Descent
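
Since the notebooks themselves are not reproduced here, the following is a minimal NumPy sketch, not the code from the video, of the ideas in the outline: mini-batch gradient descent on a toy linear-regression problem with a decaying, simulated-annealing-style step size. The data, hyperparameters, and the specific 1/(1 + decay·t) schedule are illustrative assumptions.

```python
# Minimal sketch (not the notebook from the video): mini-batch SGD for
# linear regression with a decaying ("annealed") step size.
# batch_size = 1 recovers plain stochastic GD; batch_size = n recovers
# full-batch gradient descent.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 3x + 2 + noise (hypothetical example data)
n = 1000
X = rng.uniform(-1, 1, size=(n, 1))
y = 3.0 * X[:, 0] + 2.0 + 0.1 * rng.normal(size=n)

w, b = 0.0, 0.0          # parameters
eta0, decay = 0.5, 0.01  # initial step size and annealing rate (assumed values)
batch_size, n_epochs = 32, 50

step = 0
for epoch in range(n_epochs):
    idx = rng.permutation(n)
    for start in range(0, n, batch_size):
        batch = idx[start:start + batch_size]
        xb, yb = X[batch, 0], y[batch]

        # Gradient of the mean squared error on the current mini-batch
        err = (w * xb + b) - yb
        grad_w = 2.0 * np.mean(err * xb)
        grad_b = 2.0 * np.mean(err)

        # Annealing-style schedule: the step size shrinks over time so the
        # iterates settle down instead of bouncing around the minimum.
        eta = eta0 / (1.0 + decay * step)
        w -= eta * grad_w
        b -= eta * grad_b
        step += 1

print(f"learned w={w:.3f}, b={b:.3f}  (true values: 3, 2)")
```

For comparison, scikit-learn exposes similar decaying-step-size behavior out of the box, e.g. SGDRegressor with learning_rate='invscaling' or 'adaptive'.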

Subscribe: https://www.youtube.com/channel/UCgC1d4JZ1Fz4t8MWLJD464w

#machine-learning #tensorflow #scikit-learn
