This post covers the ordinary least squares (OLS) method for simple linear regression. If you are new to linear regression, this article will give you a clear idea of how it is implemented and help you understand, step by step, how it works.

Simple linear regression is a model with a single regressor (independent variable) x that is related to a response (dependent or target variable) y by

y = β0 + β1 x + ε    (1)

where

β0: intercept

β1: slope (unknown constant)

ε: random error component

This equation describes a line where y is the dependent variable we want to predict, x is the independent variable, and β0 and β1 are the coefficients we need to estimate.

Estimation of β0 and β1:

The OLS method is used to estimate β0 and β1. It seeks to minimize the sum of the squared residuals: for each data point we compute the vertical distance from the point to the regression line, square it, and sum all of the squared errors together. The estimates that minimize this sum are β1 = Sxy / Sxx (the ratio of the sample covariance term to the sum of squares of x) and β0 = ȳ − β1 x̄.
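The idea above can be sketched in plain Python. This is a minimal illustration, not a production implementation; the data points are made up so that they lie exactly on a known line, letting us check that the closed-form OLS formulas recover the true intercept and slope.

```python
def ols_fit(x, y):
    """Estimate intercept b0 and slope b1 by minimizing
    the sum of squared residuals (closed-form OLS)."""
    n = len(x)
    x_mean = sum(x) / n
    y_mean = sum(y) / n
    # S_xy: cross-products of deviations; S_xx: squared deviations of x
    s_xy = sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, y))
    s_xx = sum((xi - x_mean) ** 2 for xi in x)
    b1 = s_xy / s_xx           # slope estimate
    b0 = y_mean - b1 * x_mean  # intercept estimate
    return b0, b1

# Illustrative data lying exactly on y = 2 + 3x (zero error term),
# so OLS should recover intercept 2 and slope 3
x = [1, 2, 3, 4, 5]
y = [5, 8, 11, 14, 17]
b0, b1 = ols_fit(x, y)
print(b0, b1)  # 2.0 3.0
```

With noisy real-world data the fitted line will not pass through every point; OLS simply picks the line whose squared vertical errors sum to the smallest possible value.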



Ordinary Least Square (OLS) Method for Linear Regression