Bayes’ Theorem

Bayes’ Theorem gives us the posterior probability of an event given what is known as prior knowledge. For two events A and B, it states that P(A|B) = P(B|A) × P(A) / P(B), where P(A|B) is the posterior, P(A) the prior, P(B|A) the likelihood, and P(B) the marginal likelihood.

**Prior probability** is simply the proportion of each class of the dependent (binary) variable in the data set. It is the closest guess you can make about a class without any further information, or in other words, how probable A is before observing B.

**Likelihood** is the probability of observing a given piece of evidence when a particular class is present. In other words, how probable B is given that A is true.

**Marginal likelihood** is how probable the new data point is under all possible values of the class variable, i.e. how probable B is regardless of A.
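
As a quick illustration, here is a minimal Python sketch that plugs made-up numbers into Bayes’ Theorem; the events and probabilities below are purely hypothetical and not taken from any real data set.

```python
# Toy illustration of Bayes' Theorem with made-up numbers:
# A = "email is spam", B = "email contains the word 'offer'".

prior = 0.30        # P(A): proportion of spam emails in the data set
likelihood = 0.80   # P(B|A): probability that a spam email contains "offer"
marginal = 0.35     # P(B): probability that any email contains "offer"

# Posterior P(A|B): probability the email is spam given it contains "offer"
posterior = likelihood * prior / marginal
print(f"P(spam | 'offer') = {posterior:.3f}")   # ≈ 0.686
```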

The Naive Bayes Classifier is a supervised machine learning algorithm. It is a simple yet effective algorithm that classifies an object or observation with the help of Bayes’ Theorem.

Naive Bayes is a classification technique based on Bayes’ theorem together with an assumption of independence between predictors. In simple terms, a Naive Bayes classifier assumes that the presence of a particular feature in a class is unrelated to the presence of any other feature.
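
To make that independence assumption concrete, here is a small from-scratch sketch of how such a classifier could score two classes: each class score is the prior multiplied by the per-feature likelihoods, treated as if the features were independent. The classes, features, and probabilities are invented purely for illustration.

```python
# Minimal sketch of the 'naive' scoring rule: score(class) = P(class) * prod_i P(feature_i | class).
# All numbers below are made up for illustration.

priors = {"spam": 0.30, "ham": 0.70}

# P(feature | class) for two hypothetical binary features
likelihoods = {
    "spam": {"offer": 0.80, "meeting": 0.10},
    "ham":  {"offer": 0.15, "meeting": 0.60},
}

def score(cls, observed_features):
    """Unnormalised posterior: class prior times the product of per-feature likelihoods."""
    s = priors[cls]
    for f in observed_features:
        s *= likelihoods[cls][f]
    return s

observed = ["offer", "meeting"]              # features seen in a new email
scores = {c: score(c, observed) for c in priors}
total = sum(scores.values())                 # plays the role of the marginal likelihood
posteriors = {c: s / total for c, s in scores.items()}
print(posteriors)                            # e.g. {'spam': ~0.28, 'ham': ~0.72}
```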

The most common question that comes to mind from the name of this algorithm is:

Q. Why is Naive Bayes so ‘naive’?

Naive Bayes is so ‘naive’ because it assumes that all of the features in a data set are equally important and independent. These assumptions are rarely, if ever, true in real-life data.
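
In practice, a Naive Bayes classifier is usually trained through a library rather than by hand. Below is a brief, hypothetical usage sketch with scikit-learn’s GaussianNB on randomly generated data; the data and parameters are assumptions for illustration only.

```python
# Hypothetical usage sketch: training scikit-learn's Gaussian Naive Bayes
# classifier on randomly generated data (not data from this article).
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                  # 200 samples, 4 numeric features
y = (X[:, 0] + X[:, 1] > 0).astype(int)        # toy binary target

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = GaussianNB()
clf.fit(X_train, y_train)                      # estimates class priors and per-feature likelihoods
print("accuracy:", clf.score(X_test, y_test))  # mean accuracy on the held-out split
```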

