1. MACHINE LEARNING

  • In recent years, machine learning has attracted intense research interest in both industry and academia, and has shown its potential in large-scale applications such as data exploration, prediction, pattern recognition and analysis, and drawing conclusions from collected data.
  • In this field of study, data resources are central to the learning task: they help build a deeper understanding and come in many different forms.
  • For small-scale datasets, expert knowledge is often sufficient for accurate annotation and interpretation.
  • For large-scale datasets, analysis becomes a far more complicated task, and accuracy and precise predictions matter greatly for unstructured or unlabeled data. Machine learning techniques solve such problems by leveraging the posterior knowledge learned from big data.
  • As datasets grow, models trained on them tend to generalize better, but annotation costs both money and time. Consequently, mathematical and statistical methods are widely deployed to make annotation successful. Cross-entropy is one such tool.


Now, let’s learn about cross-entropy, its extensions (the cross-entropy loss function and KL divergence), and their role in machine learning.

Cross-entropy is commonly used in machine learning as a loss function.
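As a minimal sketch of how that loss is computed (a from-scratch NumPy example, not a particular library's API; the `eps` clipping is an implementation detail to avoid `log(0)`):

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """Cross-entropy H(P, Q) = -sum_x p(x) * log(q(x)).

    p: the true distribution (e.g. a one-hot label),
    q: the model's predicted probabilities.
    eps guards against taking log(0).
    """
    p = np.asarray(p, dtype=float)
    q = np.clip(np.asarray(q, dtype=float), eps, 1.0)
    return -np.sum(p * np.log(q))

# A one-hot label and a model's predicted probabilities:
label = [0.0, 1.0, 0.0]
prediction = [0.1, 0.8, 0.1]
loss = cross_entropy(label, prediction)  # -log(0.8), about 0.223
```

The loss is small when the model puts high probability on the true class and grows without bound as that probability approaches zero, which is exactly why it makes a useful training signal.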

UNDERSTANDING THE CONCEPT OF CROSS-ENTROPY

In order to understand the concept of cross-entropy, let’s start with the definition of entropy:

H(P) = − Σ p(x) · log p(x), summed over all outcomes x of the distribution P
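The entropy definition can be sketched in code as follows (a minimal NumPy example; the `eps` clipping is only there to keep `log(0)` from producing NaNs):

```python
import numpy as np

def entropy(p, eps=1e-12):
    """Shannon entropy H(P) = -sum_x p(x) * log(p(x)), in nats."""
    p = np.clip(np.asarray(p, dtype=float), eps, 1.0)
    return -np.sum(p * np.log(p))

# A fair coin is maximally uncertain over two outcomes:
print(entropy([0.5, 0.5]))  # log(2), about 0.693
# A certain outcome carries no uncertainty:
print(entropy([1.0, 0.0]))  # about 0.0
```

Intuitively, entropy measures the average surprise of drawing from the distribution: a uniform distribution maximizes it, while a deterministic one drives it to zero.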


A Friendly Introduction to Cross-Entropy for Machine Learning