Polynomial regression is an improved version of linear regression. If you already know linear regression, this will be simple for you; if not, I will explain the formulas in this article. There are more advanced and more efficient machine learning algorithms out there, but it is still a good idea to learn linear-based regression techniques, because they are simple, fast, and rely on well-known formulas, even though they may not work well on complex datasets.

Polynomial Regression Formula

Linear regression can perform well only if there is a linear correlation between the input variables and the output variable. As I mentioned before, polynomial regression is built on linear regression. If you need a refresher, here is the link to my linear regression article:

Linear Regression Algorithm in Python: Learn the concepts of linear regression and develop a complete linear regression algorithm from scratch in Python (towardsdatascience.com)

Polynomial regression can find the relationship between the input features and the output variable even when that relationship is not linear. It starts from the same formula as linear regression:

Y = BX + C

I am sure we all learned this formula in school. For linear regression, we write it with theta symbols instead:

Y = θ₀ + θ₁X

Here, we get X and Y from the dataset. X is the input feature and Y is the output variable. Theta values are initialized randomly.
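
As a quick illustration, here is a minimal sketch of that hypothesis in Python with NumPy; the example values of X and the random theta initialization are my own placeholders, not the article's dataset:

```python
import numpy as np

# Hypothetical example values: X is one input feature from the dataset
X = np.array([1.0, 2.0, 3.0, 4.0])

# Theta values are initialized randomly: [theta_0, theta_1]
theta = np.random.randn(2)

# Linear hypothesis: Y = theta_0 + theta_1 * X
Y_pred = theta[0] + theta[1] * X
print(Y_pred)
```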

For polynomial regression, the formula becomes like this:

Y = θ₀ + θ₁X + θ₂X² + θ₃X³ + ... + θₙXⁿ

We are adding more terms here. We are using the same input feature and raising it to different powers to create additional features. That way, the algorithm can capture more of the structure in the data.
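
To make the idea concrete, here is a minimal sketch of that feature expansion, assuming a hypothetical helper named polynomial_features and a chosen degree of 3; it is only an illustration, not the article's final implementation:

```python
import numpy as np

def polynomial_features(X, degree):
    """Build a feature matrix [X^0, X^1, ..., X^degree] from a 1-D input feature."""
    return np.column_stack([X ** d for d in range(degree + 1)])

# Hypothetical example: expand one input feature up to degree 3
X = np.array([1.0, 2.0, 3.0, 4.0])
features = polynomial_features(X, degree=3)   # shape (4, 4): columns are 1, X, X^2, X^3

theta = np.random.randn(features.shape[1])    # one theta per term, initialized randomly
Y_pred = features @ theta                     # Y = theta_0 + theta_1*X + theta_2*X^2 + theta_3*X^3
print(Y_pred)
```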
