
Deep Learning now dominates fields from agriculture and medical science to automobiles, education, defense, and security, so the algorithms behind neural networks must be efficient to deliver good results. Optimization techniques become the centerpiece of deep learning when one expects better and faster results from a neural network, and the choice among optimization techniques can make the difference between waiting hours or days for excellent accuracy. The main levers for optimization in a neural network are:

- Better Optimization Algorithm
- Better Activation Function
- Better Initialization Method
- Better Regularization

In this article, we will focus only on better optimization algorithms for Deep Neural Networks (DNNs); we will refer to these optimization algorithms as learning algorithms throughout. There are several well-known learning algorithms out there. Let's have a look at them.

- Vanilla Gradient Descent (GD)
- Momentum Based Gradient Descent
- Nesterov Accelerated Gradient Descent (NAG)
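As a rough sketch of what these three update rules look like (illustrative NumPy code, not from the article; `grad_fn`, `lr`, and `beta` are assumed names), note that Momentum accumulates a velocity over past gradients, while NAG evaluates the gradient at a "look-ahead" point:

```python
import numpy as np

def vanilla_gd(w, grad_fn, lr=0.1, steps=100):
    # Plain gradient descent: w <- w - lr * grad(w)
    for _ in range(steps):
        w = w - lr * grad_fn(w)
    return w

def momentum_gd(w, grad_fn, lr=0.1, beta=0.9, steps=100):
    # Momentum: accumulate a velocity v, then step along it
    v = np.zeros_like(w)
    for _ in range(steps):
        v = beta * v + lr * grad_fn(w)
        w = w - v
    return w

def nag(w, grad_fn, lr=0.1, beta=0.9, steps=100):
    # NAG: same as momentum, but the gradient is taken at the
    # look-ahead point w - beta*v, which damps overshooting
    v = np.zeros_like(w)
    for _ in range(steps):
        v = beta * v + lr * grad_fn(w - beta * v)
        w = w - v
    return w
```

On a simple quadratic loss, all three converge, but momentum and NAG trade some oscillation for faster progress along shallow directions.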

- Stochastic Update
- Mini-Batch Update
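The stochastic and mini-batch strategies differ only in how many examples contribute to each gradient step. A minimal sketch (assuming a user-supplied `grad_fn(w, X_batch, y_batch)`; names are illustrative):

```python
import numpy as np

def sgd_epoch(w, X, y, grad_fn, lr=0.05, batch_size=1):
    """One epoch of (mini-batch) stochastic gradient descent.

    batch_size=1 -> stochastic update (one example per step)
    batch_size=k -> mini-batch update (k examples per step)
    """
    idx = np.random.permutation(len(X))  # shuffle each epoch
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        w = w - lr * grad_fn(w, X[batch], y[batch])
    return w
```

Stochastic updates are cheap but noisy; mini-batches average that noise out and map well onto vectorized hardware, which is why they are the default in practice.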

- AdaGrad
- RMSProp
- Adam (a mixture of RMSProp and Momentum-based GD)
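These three adapt the learning rate per parameter from the history of squared gradients. A minimal NumPy sketch of the update rules (illustrative, with assumed hyperparameter names):

```python
import numpy as np

def adagrad(w, grad_fn, lr=0.1, eps=1e-8, steps=1000):
    # AdaGrad: accumulate squared gradients; the sum never decays,
    # so the effective learning rate shrinks over time
    s = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w)
        s = s + g**2
        w = w - lr * g / (np.sqrt(s) + eps)
    return w

def rmsprop(w, grad_fn, lr=0.01, beta=0.9, eps=1e-8, steps=1000):
    # RMSProp: exponential moving average of squared gradients,
    # so the effective learning rate can recover (unlike AdaGrad)
    s = np.zeros_like(w)
    for _ in range(steps):
        g = grad_fn(w)
        s = beta * s + (1 - beta) * g**2
        w = w - lr * g / (np.sqrt(s) + eps)
    return w

def adam(w, grad_fn, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8, steps=500):
    # Adam: momentum on the gradient (m) + RMSProp-style scaling (v),
    # with bias correction for the zero-initialized moments
    m = np.zeros_like(w)
    v = np.zeros_like(w)
    for t in range(1, steps + 1):
        g = grad_fn(w)
        m = beta1 * m + (1 - beta1) * g       # first moment (momentum)
        v = beta2 * v + (1 - beta2) * g**2    # second moment (RMSProp)
        m_hat = m / (1 - beta1**t)            # bias correction
        v_hat = v / (1 - beta2**t)
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w
```

The sketch makes the "mixture" in Adam concrete: its `m` is the momentum term and its `v` is the RMSProp term, combined in a single per-parameter step.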

