## Graphing The SIR Model With Python

Graphing and solving simultaneous differential equations to model COVID-19 spread
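The SIR model couples three differential equations: dS/dt = -βSI, dI/dt = βSI - γI, dR/dt = γI. As a minimal sketch (the parameter values below are illustrative, not taken from the article), a forward-Euler integration in pure Python looks like:

```python
def simulate_sir(beta, gamma, s0, i0, r0, days, dt=0.1):
    """Integrate dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I,
    dR/dt = gamma*I with a forward-Euler step.
    S, I, R are fractions of the total population."""
    s, i, r = s0, i0, r0
    for _ in range(int(days / dt)):
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        dr = gamma * i
        s, i, r = s + ds * dt, i + di * dt, r + dr * dt
    return s, i, r

# Illustrative parameters: beta=0.3, gamma=0.1, 1% initially infected.
s, i, r = simulate_sir(0.3, 0.1, 0.99, 0.01, 0.0, days=60)
```

Because dS + dI + dR = 0 at every step, the three compartments always sum to the initial population, which is a handy sanity check on the integration.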

## Understanding Error Backpropagation

In this article, I will explain how the backpropagation algorithm works, then build a simple neural network from scratch and test it on the regression problem from my previous post.

## Unfolding Logistic Regression

Don’t be confused by the name: although it says “regression,” Logistic Regression is a supervised learning algorithm used for carrying out classification tasks.
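The classification step boils down to passing a linear score through the sigmoid and thresholding the resulting probability. A minimal sketch (the weights here are made up for illustration, not learned):

```python
import math

def sigmoid(z):
    """Squash a real-valued score into a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def predict(weights, bias, x):
    """Linear score through the sigmoid gives P(y=1 | x);
    classify as 1 when that probability is at least 0.5."""
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if sigmoid(z) >= 0.5 else 0
```

Training would fit `weights` and `bias` by maximizing the log-likelihood; the decision rule itself is just this threshold.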

## Multi-Layer Perceptron & Backpropagation — Implemented from scratch

In this post, we are going to re-play the classic Multi-Layer Perceptron. Most importantly, we will play the solo called backpropagation, which is, indeed, one of the machine-learning standards.
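The heart of backpropagation is the chain rule walked backwards through the network. As a tiny sketch (a two-weight network with one tanh hidden unit and a squared-error loss, chosen here for brevity rather than taken from the post), the analytic gradients can be checked against finite differences:

```python
import math

def forward(w1, w2, x):
    """Two-layer net: one tanh hidden unit, linear output."""
    h = math.tanh(w1 * x)
    return w2 * h

def backprop_grads(w1, w2, x, y):
    """Chain rule for L = 0.5*(yhat - y)^2: propagate the error
    backwards through the output weight, the tanh, and the input weight."""
    h = math.tanh(w1 * x)
    yhat = w2 * h
    delta = yhat - y              # dL/dyhat
    g_w2 = delta * h              # dL/dw2
    g_h = delta * w2              # dL/dh
    g_w1 = g_h * (1 - h * h) * x  # tanh'(z) = 1 - tanh(z)^2
    return g_w1, g_w2
```

Comparing these gradients with a numerical finite-difference estimate is the standard way to verify a from-scratch backprop implementation.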

## How to Calculate the Fibonacci Sequence in Logarithmic Time

The naïve solution to the famous Fibonacci problem is exponential, and most people can improve it to linear time. But can we do better?
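One logarithmic-time approach (whether or not it is the one the article settles on) is the fast-doubling method, which halves n at every step:

```python
def fib(n):
    """Fast-doubling Fibonacci: O(log n) arithmetic operations, using
    F(2k)   = F(k) * (2*F(k+1) - F(k))
    F(2k+1) = F(k)^2 + F(k+1)^2."""
    def helper(k):
        # Returns the pair (F(k), F(k+1)).
        if k == 0:
            return (0, 1)
        a, b = helper(k >> 1)
        c = a * (2 * b - a)   # F(2m) where m = k // 2
        d = a * a + b * b     # F(2m+1)
        if k & 1:
            return (d, c + d)
        return (c, d)
    return helper(n)[0]
```

The same bound can also be reached by exponentiating the 2×2 matrix [[1, 1], [1, 0]] by repeated squaring; fast doubling is that idea with the redundant matrix entries stripped away.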

## Waiting Line Models

A Full Guide to Waiting Line Models and Queuing Theory. In this article, I will give a detailed overview of waiting line models. I will discuss when and how to use waiting line models from a business standpoint.
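The workhorse of waiting-line analysis is the M/M/1 queue: a single server with Poisson arrivals and exponential service times. A sketch of its steady-state formulas (one standard model among several the guide may cover):

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 formulas:
    utilization           rho = lambda / mu
    mean number in system L   = rho / (1 - rho)
    mean time in system   W   = 1 / (mu - lambda)."""
    rho = arrival_rate / service_rate
    if rho >= 1:
        raise ValueError("unstable queue: arrivals outpace service")
    L = rho / (1 - rho)
    W = 1 / (service_rate - arrival_rate)
    return rho, L, W

# E.g. 2 customers/hour arriving, 3 served/hour on average.
rho, L, W = mm1_metrics(2, 3)
```

With these numbers a business can answer questions like "how long does the average customer wait?" before committing to extra staff.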

## Hacking Through Mathematics

In the same spirit as my guide to hacking statistics, here are the details of how I managed to hack through mathematics.

## Bayesian Updating and the “Picture Becomes Clearer” Analogy

The first two figures show what can happen — they are extreme cases, and other cases may incorporate both small continuous changes and big jumps. The third figure below indicates what “the picture becomes clearer” analogy would suggest — that probabilities should change only by moving closer to the ultimate outcome.

## Matrices in Data Science Are Always Real and Symmetric

In this article, we will consider three examples of real and symmetric matrix models that we often encounter in data science and machine learning: the regression matrix (R), the covariance matrix, and the linear discriminant analysis matrix (L).

## A crash course on floating point numbers in a computer

What are the numeric data types in a computer programming language? That is one of the first things a beginner is taught. Unlike integers, the floating point number system in a computer doesn’t use two’s complement to represent negative numbers. In most cases, the ...
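The sign-bit representation is easy to see by inspecting the raw IEEE-754 bit pattern of a float. A small sketch using Python's standard `struct` module:

```python
import struct

def float_bits(x):
    """Return the 32-bit IEEE-754 bit pattern of a float as a string."""
    (raw,) = struct.unpack(">I", struct.pack(">f", x))
    return format(raw, "032b")

# The only difference between +1.0 and -1.0 is the leading sign bit --
# unlike two's-complement integers, where negation flips many bits at once.
plus_one = float_bits(1.0)
minus_one = float_bits(-1.0)
```

The remaining 31 bits split into an 8-bit biased exponent and a 23-bit fraction, which is where floats and integers part ways completely.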

## Why Linear Algebra Is Important For Programming

Error correcting codes: Another unseen but widespread use of linear algebra is in coding theory. The problem is to encode data in such a way that if the encoded data is tampered with a little bit, you can still recover the unencoded data.
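As a small illustration of this idea, the classic Hamming(7,4) code (one standard construction, not necessarily the one the article develops) encodes 4 data bits into 7 so that any single flipped bit can be located and repaired. Each parity bit is a mod-2 linear combination of data bits, i.e. a matrix product over GF(2):

```python
def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit Hamming codeword,
    with parity bits at (1-indexed) positions 1, 2, and 4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Recompute the parity checks; the three syndrome bits spell out
    the 1-indexed position of a single flipped bit (0 means no error).
    Returns the recovered 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # checks positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # checks positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # checks positions 4,5,6,7
    pos = s1 + 2 * s2 + 4 * s3
    if pos:
        c[pos - 1] ^= 1              # flip the offending bit back
    return [c[2], c[4], c[5], c[6]]
```

The syndrome trick works because each codeword position is covered by a unique subset of the parity checks, so the check results read out the error position in binary.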

## Explaining subset selection and regularization methods for least-squares

We explain the most widely used linear regression technique, least squares, along with distinct approaches to multiple linear regression and regression with multiple outputs.

## Do You Struggle With The Quantum Superposition?

In this post, we have a closer look at the quantum superposition and how its state determines whether we will measure the qubit as 0 or 1.
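The connection between a superposition state and the measured bit is the Born rule: for |ψ⟩ = α|0⟩ + β|1⟩, the probabilities are |α|² and |β|². A minimal sketch (normalizing defensively, in case the amplitudes are not unit-length):

```python
def measurement_probabilities(alpha, beta):
    """Born rule for a single qubit |psi> = alpha|0> + beta|1>:
    P(0) = |alpha|^2, P(1) = |beta|^2. Amplitudes may be complex."""
    p0 = abs(alpha) ** 2
    p1 = abs(beta) ** 2
    total = p0 + p1
    return p0 / total, p1 / total
```

An equal superposition such as (|0⟩ + |1⟩)/√2 therefore measures 0 or 1 with probability one half each, even though the state itself is perfectly definite.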

## The ins and outs of Gradient Descent

Gradient descent is an optimization algorithm used to minimize some cost function by iteratively moving in the direction of steepest descent. We will start by exploring some basic optimizers commonly used in classical machine learning and then move on to some of the more popular algorithms used in Neural Networks and Deep Learning.
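The core update is a one-liner: step against the gradient, scaled by a learning rate. A minimal sketch on a toy quadratic (the function and learning rate are illustrative, not from the article):

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly apply x <- x - lr * grad(x), the basic update that
    the fancier optimizers (momentum, Adam, ...) all build on."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2*(x - 3).
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

On this convex one-dimensional cost the iterates contract toward the minimizer x = 3 at a fixed rate; the neural-network variants in the article differ mainly in how they adapt the step size and direction.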

## Achieving Loose Coupling with a Math Expression Parser

Using mxParser to separate the definition of the data-calculation logic from where it is executed

## The Roadmap of Mathematics for Deep Learning

Understanding the inner workings of neural networks from the ground up

## Computational Complexity of Neural Networks

On why neural networks are so slow to train, and how we can speed them up

## Matrix Multiplication Made Easy (In Java)

Dot products are an important operation when working with vectors (and they are the key to making matrix multiplication so simple), so if you’re still not quite sure what you’re looking at, the “good stuff” in this post will get confusing fast.
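The post works in Java; as a language-agnostic sketch of the same structure, here it is in Python, with each result entry written explicitly as a dot product:

```python
def dot(u, v):
    """Dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def matmul(A, B):
    """Entry (i, j) of A*B is the dot product of
    row i of A with column j of B."""
    cols = list(zip(*B))  # transpose B so its columns are easy to iterate
    return [[dot(row, col) for col in cols] for row in A]
```

Once the dot product is a named helper, matrix multiplication collapses into two nested loops over rows and columns, which is exactly the simplification the post is after.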

## Minimum Numbers of Function Calls to Make Target Array With Python and Javascript

Solving a medium LeetCode problem, with solutions in both Python and JavaScript.
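A common greedy way to reason about this problem (which may or may not be the route the post takes) is in binary: each set bit of a target number costs one increment, while the doublings are shared across the whole array, one per bit below the highest set bit. A sketch:

```python
def min_operations(nums):
    """Minimum calls to build nums from all zeros, where a call either
    increments one element by 1 or doubles every element.
    Count one increment per set bit, plus shared doublings equal to
    (highest bit length - 1)."""
    increments = sum(bin(x).count("1") for x in nums)
    doublings = max((x.bit_length() for x in nums), default=0)
    return increments + max(0, doublings - 1)
```

For example, [1, 5] needs the three set bits (1 = 0b1, 5 = 0b101) plus two doublings shared by both positions, for five calls in total.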