Introduction

In machine learning, feature scaling is an essential preprocessing step because it ensures that the features of a dataset are measured on the same scale. The concept comes from statistics, where it is used to put variables with different units or ranges on a common scale. It is especially useful when the features of a dataset have large differences between their ranges; in that case, standardization rescales every feature so they can all be compared on the same scale.

This concept is extensively used in algorithms such as SVM, Logistic Regression, and Neural Networks.

The concept of Standardization (Z-Score Normalization) is based on two mathematical concepts: Standard Deviation and Variance.
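As a minimal sketch of the idea, standardization subtracts the mean from each value and divides by the standard deviation, producing z-scores with zero mean and unit variance. The function name `standardize` below is illustrative, not from any particular library:

```python
import math

def standardize(values):
    """Rescale a list of numbers to zero mean and unit variance (z-scores)."""
    mean = sum(values) / len(values)
    # Population variance: the average of the squared differences from the mean
    variance = sum((x - mean) ** 2 for x in values) / len(values)
    std_dev = math.sqrt(variance)
    # Z-score: how many standard deviations each value lies from the mean
    return [(x - mean) / std_dev for x in values]
```

After this transformation, features that originally had very different ranges all vary on the same scale, which is why it helps the algorithms listed above.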

Variance

The variance is the average of the squared difference from the mean.

Below are the steps to compute the variance:

  • Calculate the mean
  • Subtract the mean from each number
  • Square each difference
  • Average the squared differences
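The steps above can be sketched directly in Python; `variance` here is an illustrative helper, computing the population variance (dividing by the number of values):

```python
def variance(values):
    # Step 1: calculate the mean
    mean = sum(values) / len(values)
    # Steps 2-3: subtract the mean from each number and square the result
    squared_diffs = [(x - mean) ** 2 for x in values]
    # Step 4: average the squared differences
    return sum(squared_diffs) / len(values)
```

For example, the values [2, 4, 4, 4, 5, 5, 7, 9] have mean 5, and the average of their squared differences from 5 is 4.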


Machine Learning Standardization with Mathematics