In this article, we focus solely on optimization algorithms for Deep Neural Networks (DNNs). Throughout, we will refer to the optimization algorithm as the learning algorithm.

In this article, I would like to present a method for approximating the global minimum or maximum in an N-dimensional space that we used in a recommender system.

No matter what kind of Machine Learning model you're working on, you need to optimize it, and in this blog we'll learn exactly how optimization works.

In the 1940s, mathematical programming was synonymous with optimization. An optimization problem consists of an objective function that is to be maximized or minimized by choosing input values from an allowed set [1]. Nowadays, optimization is a very familiar term in AI, particularly in Deep Learning.

An adaptive and well-known optimization technique. In complex machine learning models, performance usually depends on multiple input parameters, and these parameters must be properly tuned to obtain the optimal model.

Activation Functions, Optimization Techniques, and Loss Functions: an activation function is a mathematical equation that determines the output of a neural network, and it is an essential component of the network.

The Shuffled Frog Leaping Algorithm (SFLA) is one of the most innovative optimization algorithms, inspired by the social behavior of frogs in nature.

An optimizer is a technique we use to minimize the loss or increase the accuracy. We do that by finding a local minimum of the cost function.
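As a minimal sketch of this idea, here is plain gradient descent on a toy quadratic cost; the cost function, starting point, and learning rate are illustrative choices, not taken from any of the articles above:

```python
# Gradient descent on f(x) = (x - 3)^2, whose minimum is at x = 3.
# Each step moves the parameter against the gradient of the cost.

def grad(x):
    # Derivative of (x - 3)^2 with respect to x.
    return 2 * (x - 3)

x = 0.0    # initial parameter value (illustrative)
lr = 0.1   # learning rate, i.e. step size (illustrative)

for _ in range(100):
    x -= lr * grad(x)  # step downhill along the cost surface

print(round(x, 4))  # converges toward the minimum at x = 3
```

With a convex cost like this one the local minimum found is also the global minimum; for neural-network losses, gradient descent only guarantees convergence toward a local minimum.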

Optuna is a hyperparameter optimization framework. Hyperparameter optimization is the science of tuning, or choosing, the best set of hyperparameters for a learning algorithm.
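Setting Optuna's own API aside, the core loop of hyperparameter search can be sketched as plain random search; the objective function, hyperparameter names, and search ranges below are all hypothetical, chosen only to illustrate the idea:

```python
import random

random.seed(0)

def validation_loss(lr, depth):
    # Hypothetical stand-in for training a model with these
    # hyperparameters and measuring its validation loss.
    return (lr - 0.01) ** 2 + (depth - 6) ** 2 * 1e-4

best = None
for _ in range(200):
    # Sample a candidate configuration from the search space.
    trial = {"lr": random.uniform(1e-4, 0.1), "depth": random.randint(2, 12)}
    loss = validation_loss(**trial)
    if best is None or loss < best[0]:
        best = (loss, trial)  # keep the best configuration seen so far

print(best[1])  # the best hyperparameters found by random search
```

A framework like Optuna automates this loop and replaces blind random sampling with smarter strategies, such as adaptive samplers and early pruning of unpromising trials.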

The Artificial Electric Field Algorithm, 'AEFA' for short, is a nature-inspired optimization algorithm for mathematical optimization problems.

There are different ways to check the correctness and accuracy of an implemented metaheuristic multi-objective optimization algorithm.

A topic under Swarm Intelligence.