This write-up re-introduces the concept of entropy from different perspectives with a focus on its importance in machine learning, probabilistic programming, and information theory.

Here is how dictionaries define it, per a quick Google search:

[Image: screenshot of Google's dictionary result for "entropy". Source: Google Search]

Based on this result, you can see that there are two core ideas here, and at first the connection between them is not obvious:

  • Entropy is the missing (or required) energy to do work, as per thermodynamics
  • Entropy is a measure of disorder or randomness (uncertainty), as the sketch right after this list illustrates
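
To ground that second idea before moving on, here is a minimal sketch of Shannon entropy, the information-theoretic measure of uncertainty. This is my own illustration (the helper name shannon_entropy is mine; only NumPy is assumed):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = sum(p * log2(1/p)), measured in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                          # by convention, 0 * log(0) = 0
    return np.sum(p * np.log2(1.0 / p))

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit (maximum uncertainty)
print(shannon_entropy([0.9, 0.1]))   # biased coin: ~0.47 bits (more predictable)
print(shannon_entropy([1.0, 0.0]))   # certain outcome: 0.0 bits (no surprise)
```

The more uniform (disordered) the distribution, the higher the entropy; a perfectly certain outcome carries no uncertainty at all.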

So what is it: missing energy, a measure, or both? Let me offer some perspectives that will hopefully help you make peace with these definitions.

Shit Happens!

Rephrasing this obnoxious title into something a bit more acceptable:

Anything that can go wrong, will go wrong — Murphy’s Law

We have all accepted this law because we observe and experience it all the time, and the culprit behind it is none other than the topic of this write-up. Yup, you got it: it's entropy!

So now I have confused you even more: entropy is not only the missing energy and the measure of disorder, but it is also responsible for the disorder. Great!

It seems we cannot make up our minds as far as the definition is concerned. The truth, however, is that all three of the perspectives mentioned above are correct in the appropriate context. To understand those contexts, let's first look at disorder and its relationship with entropy.

Disorder is the dominating force

I will explain this with the help of examples from an article by James Clear (author of Atomic Habits).

[Image: left image (https://pixabay.com/illustrations/puzzle-puzzle-piece-puzzles-3303412/), right image (photo by James Lee on Unsplash), annotated by the author]

Theoretically, both of these scenarios are possible, but the odds of them happening are astronomically small. OK, fine, call it impossible 🤐! The main message here is the following:

There are always far more disorderly variations than orderly ones!

and borrowing the wisdom of the great Steven Pinker:

[Image: a quote from Steven Pinker]
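
To make the claim that disorderly variations vastly outnumber orderly ones concrete, here is a toy calculation of my own (standard library only). Think of each arrangement as a sequence of 50 coin flips: there is exactly one perfectly ordered all-heads sequence, while the roughly balanced, "disordered" sequences number in the hundreds of trillions.

```python
from math import comb

n = 50                        # 50 coin flips, 2**n possible sequences
total = 2 ** n
ordered = 1                   # exactly one all-heads ("perfectly ordered") sequence
balanced = comb(n, n // 2)    # sequences with exactly 25 heads and 25 tails

print(f"total sequences:     {total:,}")      # 1,125,899,906,842,624
print(f"all-heads sequences: {ordered}")
print(f"25-heads sequences:  {balanced:,}")   # 126,410,606,437,752
```

This counting view is exactly Boltzmann's take on entropy, S = k·ln(W): the more microstates W a configuration has, the higher its entropy, which is why systems left to themselves drift toward disorder.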

