How to Build a Neural Network from Scratch

In this post, we will build our own neural network from scratch, with one hidden layer and a sigmoid activation function. We will take a closer look at derivatives and the chain rule to get a clear picture of the backpropagation implementation. Our network will be able to solve a linear regression task with the same accuracy as a Keras analog.
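As a taste of the chain rule the post relies on, the sigmoid's derivative has a closed form that backpropagation reuses at every hidden unit. A minimal sketch (an illustration, not the post's actual code):

```python
import numpy as np

def sigmoid(x):
    """Sigmoid activation: squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    """Derivative via the chain rule: sigmoid(x) * (1 - sigmoid(x))."""
    s = sigmoid(x)
    return s * (1.0 - s)
```

Because the derivative is expressed in terms of the forward-pass output, backpropagation gets it almost for free.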

A Complete Guide to Linear Regression for Beginners

Linear Regression is the simplest, most easily understandable, and most widely used supervised regression model. In this blog post, I will be discussing simple linear regression.

Gradient Descent With Momentum

This post explores how many of the most popular gradient-based optimization algorithms, such as Momentum, Adagrad, and Adam, actually work.

Adaptive Learning Rate: AdaGrad and RMSprop

With adaptive learning rate methods like AdaGrad and RMSprop, we let the optimizer tune the learning rate by learning the characteristics of the underlying data. These optimizers give frequently occurring features low learning rates and infrequent features high learning rates, thus converging faster.
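The per-parameter scaling these methods apply fits in a few lines. This toy AdaGrad step (a sketch for illustration, not code from the post) accumulates squared gradients and divides the base learning rate by their square root:

```python
import numpy as np

def adagrad_step(w, grad, cache, lr=0.1, eps=1e-8):
    """One AdaGrad update: parameters with a large accumulated squared
    gradient (frequently updated features) get a smaller effective step."""
    cache = cache + grad ** 2
    w = w - lr * grad / (np.sqrt(cache) + eps)
    return w, cache

# Minimize the toy cost f(w) = w^2 (gradient 2w) starting from w = 5.0.
w, cache = np.array([5.0]), np.zeros(1)
for _ in range(500):
    w, cache = adagrad_step(w, 2 * w, cache)
```

RMSprop replaces the running sum with an exponential moving average, so the effective learning rate stops shrinking monotonically.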

The Math behind Gradient Descent and the Normal Equation for Linear Regression

When I started my machine learning journey, the math was something that always intrigued me, and it still does. In this article, I will go over the math behind Gradient Descent and the derivation of the Normal Equation for linear regression, and then implement both on a dataset to get my coefficients. I for one believe that…

Exploring Optimizers in Machine Learning

A guide to the widely used optimizer functions, and a breakdown of their benefits and limitations. In this post, we're going to dive deep into the world of optimizers for machine learning models.

Analysis of Learning Rate in Gradient Descent Algorithm Using Python

In this tutorial, you'll learn to implement and visualize the performance of gradient descent by trying different learning rate values.
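The effect such an analysis reveals can be reproduced on a toy problem. In this sketch (hypothetical; the tutorial uses its own dataset, here the cost is simply f(w) = w²), a small rate crawls, a moderate one converges quickly, and a too-large one diverges:

```python
def gradient_descent(lr, steps=50, w0=10.0):
    """Minimize f(w) = w^2 with plain gradient descent; return final |w|."""
    w = w0
    for _ in range(steps):
        w -= lr * 2 * w  # df/dw = 2w
    return abs(w)

# lr = 0.01 converges slowly, 0.1 converges quickly,
# and 1.1 overshoots the minimum on every step and diverges.
slow, fast, diverged = (gradient_descent(lr) for lr in (0.01, 0.1, 1.1))
```

For this quadratic the update is w ← (1 − 2·lr)·w, so any lr above 1.0 makes the multiplier exceed 1 in magnitude and the iterates blow up.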

Machine Learning Basics

The advancements and opportunities that Machine Learning has created are limitless in value. Before answering the question, 'What is Machine Learning?', I will highlight its importance from the perspective of 'Why Machine Learning?'.

Understanding a Neural Network through scratch coding in R: a novice guide

I will try to slightly unmask this black box: we will take a gentle dive into the mathematical formalism of a neural network, incorporating a gradient descent optimizer, and comprehensively try to build our own.

Logistic Regression With Gradient Descent in Excel

In this article, I will share how I implemented a simple Logistic Regression with Gradient Descent, so you can better understand how Logistic Regression works.

What is Gradient Descent?

This tutorial is on the basics of gradient descent. It is also a continuation of the Intro to Machine Learning post, “What is Machine Learning?”, which can be found here.

Linear Regression From Scratch in Excel

In this article, I will share how I implemented a simple Linear Regression with Gradient Descent. You can use the link 'Simple linear regression with gradient descent' to get the Excel/Google Sheets file.

Why convexity is the key to optimization

Optimization is easy with convex cost functions. The most interesting thing you first come across when starting out with machine learning is the optimization algorithm, and specifically gradient descent: a first-order iterative optimization algorithm used to minimize the cost function.

Introduction to Gradient Descent with linear regression example using PyTorch

In this post, I will discuss the gradient descent method with some examples including linear regression using PyTorch.

Which Optimizer Should I Use in my Machine Learning Project?

This article provides a summary of popular optimizers used in computer vision, natural language processing, and machine learning in general. Additionally, you will find a guideline based on three questions to help you pick the right optimizer for your next machine learning project.

BYOL: Bring Your Own Loss

How we improve delivery time estimation with a custom loss function. Dear connoisseurs, I invite you to take a look inside Careem's food delivery platform. Specifically, we are going to look at how we use machine learning to improve the customer experience for delivery time tracking.

Understanding Gradient Descent

Let's reach the global minimum. Optimization algorithms are designed to converge to a solution; that solution could be a local minimum or the global minimum, found by minimizing a cost function, say L.
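The minimization loop itself is only a few lines. A minimal sketch (the cost L and starting point below are made up for illustration):

```python
def minimize(grad_L, w, lr=0.1, steps=200):
    """Plain gradient descent: repeatedly step against the gradient of L."""
    for _ in range(steps):
        w = w - lr * grad_L(w)
    return w

# L(w) = (w - 3)^2 is convex, so its only minimum is the global one at w = 3;
# its gradient is 2 * (w - 3).
w_star = minimize(lambda w: 2 * (w - 3), w=0.0)
```

For a non-convex L, the same loop can stall at a local minimum instead, which is why the distinction in the teaser above matters.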

An Introduction to Gradient Descent

What is Gradient Descent? Suppose you’re blindfolded in the mountains, and your goal is to reach the bottom of the valley swiftly.

Strengths and Weaknesses of Optimization Algorithms Used for ML

A deep dive into Gradient Descent and other optimization algorithms.