Different Optimization Algorithms for Deep Neural Networks: Complete Guide

In this article, we will focus only on the better optimization algorithms for Deep Neural Networks (DNNs). For the purposes of this article, we will refer to these optimization algorithms as learning algorithms.

Approximating maxima of an N-dimensional function using dimension addition

In this article, I would like to present a method for approximating the global minima or maxima of a function in an N-dimensional space, which we used in a recommender system.

Optimization in Machine Learning 

No matter what kind of Machine Learning model you're working on, you need to optimize it, and in this blog, we'll learn exactly how optimization works.

Complete Guide to Adam Optimization

In the 1940s, mathematical programming was synonymous with optimization. An optimization problem includes an objective function that is to be maximized or minimized by choosing input values from an allowed set of values [1]. Nowadays, optimization is a very familiar term in AI, particularly in Deep Learning problems.
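As a quick illustration of that definition (a generic formulation, not specific to any one article), an optimization problem amounts to choosing a point from the allowed set that minimizes or maximizes the objective:

```latex
% Generic optimization problem: pick x from the feasible set S
% so that the objective function f is minimized (or maximized).
\min_{x \in S} f(x) \qquad \text{or} \qquad \max_{x \in S} f(x)
```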

Optimization Techniques: Genetic Algorithm

The genetic algorithm is an adaptive and well-known optimization technique. In complex machine learning models, performance usually depends on multiple input parameters, and to obtain the optimal model, these parameters must be properly tuned.
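To make the idea concrete, here is a minimal, self-contained sketch of a generic genetic algorithm in Python; the parameter ranges and fitness function are hypothetical stand-ins for a real model-training loop:

```python
import random

# Hypothetical search space: two model "hyperparameters" to tune.
PARAM_RANGES = {"learning_rate": (0.001, 0.5), "num_trees": (10, 300)}

def fitness(individual):
    """Placeholder fitness: in practice this would train and evaluate a model.
    Here it simply rewards values near an arbitrary 'good' configuration."""
    lr, trees = individual["learning_rate"], individual["num_trees"]
    return -((lr - 0.1) ** 2) - ((trees - 120) / 300) ** 2

def random_individual():
    return {k: random.uniform(lo, hi) for k, (lo, hi) in PARAM_RANGES.items()}

def crossover(a, b):
    # Uniform crossover: each gene comes from either parent.
    return {k: random.choice([a[k], b[k]]) for k in PARAM_RANGES}

def mutate(individual, rate=0.2):
    # With some probability, resample a gene anywhere in its range.
    for k, (lo, hi) in PARAM_RANGES.items():
        if random.random() < rate:
            individual[k] = random.uniform(lo, hi)
    return individual

def genetic_algorithm(pop_size=30, generations=40):
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]          # selection of the fittest half
        children = [
            mutate(crossover(random.choice(survivors), random.choice(survivors)))
            for _ in range(pop_size - len(survivors))
        ]
        population = survivors + children
    return max(population, key=fitness)

if __name__ == "__main__":
    print(genetic_algorithm())
```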

Activation Functions, Optimization Techniques, and Loss Functions

Activation functions are a significant part of a neural network: they are the mathematical equations that determine the output of the network.
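As a concrete illustration, here is a minimal NumPy sketch of three common activation functions; the example inputs are arbitrary:

```python
import numpy as np

# Each activation maps a neuron's weighted input to its output.
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))   # squashes input into (0, 1)

def tanh(x):
    return np.tanh(x)                  # squashes input into (-1, 1)

def relu(x):
    return np.maximum(0.0, x)          # passes positive inputs, zeroes out negatives

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(sigmoid(z), tanh(z), relu(z), sep="\n")
```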

A Survey on Shuffled Frog-Leaping Algorithm

The Shuffled Frog Leaping Algorithm (SFLA) is one of the most innovative optimization algorithms, inspired by the social behavior of frogs in nature.

Momentum, RMSprop, and Adam Optimizers

An optimizer is a technique we use to minimize the loss and thereby improve accuracy. It does this by searching for a (local) minimum of the cost function.
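As a rough illustration of how these three optimizers update parameters, here is a minimal NumPy sketch on a toy quadratic cost; the cost function, learning rates, and decay factors are illustrative choices, not taken from the article:

```python
import numpy as np

def grad(theta):
    """Gradient of a toy quadratic cost f(theta) = 0.5 * ||theta||^2."""
    return theta

def momentum_step(theta, v, lr=0.1, beta=0.9):
    v = beta * v + grad(theta)               # velocity: accumulated past gradients
    return theta - lr * v, v

def rmsprop_step(theta, s, lr=0.01, beta=0.9, eps=1e-8):
    g = grad(theta)
    s = beta * s + (1 - beta) * g ** 2       # running average of squared gradients
    return theta - lr * g / (np.sqrt(s) + eps), s

def adam_step(theta, m, v, t, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    g = grad(theta)
    m = b1 * m + (1 - b1) * g                # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * g ** 2           # second-moment (uncentered variance) estimate
    m_hat = m / (1 - b1 ** t)                # bias correction
    v_hat = v / (1 - b2 ** t)
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

theta, m, v = np.array([1.0, -2.0]), np.zeros(2), np.zeros(2)
for t in range(1, 101):
    theta, m, v = adam_step(theta, m, v, t)
print(theta)   # approaches the minimum at [0, 0]
```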

Efficient Hyperparameter Optimization for an XGBoost Model Using Optuna

Optuna is a hyperparameter optimization framework. Hyperparameter optimization is the science of tuning, or choosing, the best set of hyperparameters for a learning algorithm.
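For a flavor of how this looks in practice, here is a minimal sketch assuming the optuna, xgboost, and scikit-learn packages; the dataset and search ranges are illustrative only:

```python
import optuna
import xgboost as xgb
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

def objective(trial):
    # Search space for a few common XGBoost hyperparameters.
    params = {
        "n_estimators": trial.suggest_int("n_estimators", 50, 400),
        "max_depth": trial.suggest_int("max_depth", 2, 10),
        "learning_rate": trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
        "subsample": trial.suggest_float("subsample", 0.5, 1.0),
    }
    model = xgb.XGBClassifier(**params)
    # Mean cross-validated accuracy is the value Optuna tries to maximize.
    return cross_val_score(model, X, y, cv=3, scoring="accuracy").mean()

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=30)
print(study.best_params, study.best_value)
```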

Artificial Electric Field Algorithm for Optimization

The Artificial Electric Field Algorithm (AEFA for short) is an artificially intelligent algorithm for mathematical optimization.

Validate the Correctness of an Evolutionary Optimization Algorithm

There are different ways to check the correctness and accuracy of an implemented metaheuristic multi-objective optimization algorithm.

Artificial Bee Colony Algorithm

A topic under Swarm Intelligence.