When constructing a neural network in Keras, there are several optimizers available in the API to choose from.

An optimizer is used to minimise the loss of a network by iteratively adjusting its weights and, in many cases, adapting the learning rate as training progresses.

For regression-based problems (where the response variable is in numerical format), the most frequently encountered optimizer is the **Adam** optimizer, a stochastic gradient descent method based on adaptive estimation of first-order and second-order moments of the gradients.

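As an illustration, a regression model could be compiled with Adam roughly as follows. This is a minimal sketch: the model architecture and data are hypothetical placeholders, and the `learning_rate`, `beta_1`, and `beta_2` values shown are simply Keras's defaults made explicit.

```python
import numpy as np
import tensorflow as tf

# Hypothetical data: 500 samples, 8 features, one numerical response
X = np.random.rand(500, 8).astype("float32")
y = np.random.rand(500, 1).astype("float32")

# Small illustrative regression model
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),  # single numerical output
])

# Adam with its default learning rate and moment-estimate decay rates
optimizer = tf.keras.optimizers.Adam(learning_rate=0.001, beta_1=0.9, beta_2=0.999)
model.compile(optimizer=optimizer, loss="mse", metrics=["mae"])
model.fit(X, y, epochs=10, batch_size=32, verbose=0)
```
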
The available optimizers in the Keras API are as follows:

  • SGD
  • RMSprop
  • Adam
  • Adadelta
  • Adagrad
  • Adamax
  • Nadam
  • Ftrl

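Any of these can be passed to `model.compile()` either by its string identifier (with default settings) or as a configured instance. A minimal sketch, using an illustrative model:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(1),
])

# By string identifier, with default hyperparameters
model.compile(optimizer="rmsprop", loss="mse")

# Or as a configured instance, e.g. with a custom learning rate
model.compile(optimizer=tf.keras.optimizers.RMSprop(learning_rate=1e-3), loss="mse")
```
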
The purpose of choosing the most suitable optimizer is not necessarily to achieve the highest accuracy per se, but rather to minimise the amount of training the neural network needs in order to reach a given level of accuracy. After all, it is much more efficient if a neural network can be trained to a certain level of accuracy after 10 epochs than after 50, for instance.

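One rough way to compare optimizers on that basis is to train the same model with each candidate and record the first epoch at which a target validation loss is reached. The sketch below assumes a synthetic regression dataset and an arbitrary threshold; it is illustrative only, not a benchmarking recipe.

```python
import numpy as np
import tensorflow as tf

# Synthetic regression data (hypothetical)
rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 10)).astype("float32")
y = (X @ rng.normal(size=(10, 1)) + 0.1 * rng.normal(size=(1000, 1))).astype("float32")

TARGET_VAL_LOSS = 0.05  # arbitrary illustrative threshold

def build_model():
    return tf.keras.Sequential([
        tf.keras.Input(shape=(10,)),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(1),
    ])

for name in ["sgd", "rmsprop", "adam"]:
    model = build_model()
    model.compile(optimizer=name, loss="mse")
    history = model.fit(X, y, validation_split=0.2, epochs=50, verbose=0)
    losses = history.history["val_loss"]
    # First epoch (1-indexed) at which the target validation loss is reached
    reached = next((i + 1 for i, v in enumerate(losses) if v <= TARGET_VAL_LOSS), None)
    if reached is not None:
        print(f"{name}: target reached at epoch {reached}")
    else:
        print(f"{name}: target not reached within 50 epochs")
```
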
#machine-learning #neural-network-algorithm #data-science #keras #tensorflow #neural-networks

Neural Networks: Importance of Optimizer Selection