Shortcut Learning: The Reason ML Models Often Fail in Practice

TL;DR: models always take the route of least effort.

Training machine learning models is far from easy. In fact, the unwary data scientist can trip and fall into as many pitfalls as there are running AWS instances. The list is endless, but it divides nicely into two broad categories: underfitting (your model is bad) and overfitting (your model is still bad, but you think it isn't). While overfitting can manifest in many ways, shortcut learning is a recurring flavor when you work with custom datasets and novel problems. It affected me; it might be affecting you.

Informally, shortcut learning occurs whenever a model fits a problem by relying on cues that are not expected to be relevant, or even present, in the general case.
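
To make the definition concrete, here is a minimal sketch (illustrative only, not from the article) using scikit-learn: a spurious "shortcut" feature almost perfectly predicts the label during training but is pure noise at test time, so a model that leans on it looks great in training and collapses in practice.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_split(n, spurious_agreement):
    # y: true label; x_signal: genuinely predictive but noisy;
    # x_shortcut: agrees with y with probability `spurious_agreement`.
    y = rng.integers(0, 2, size=n)
    x_signal = y + rng.normal(scale=2.0, size=n)
    flip = rng.random(n) > spurious_agreement
    x_shortcut = np.where(flip, 1 - y, y).astype(float)
    return np.column_stack([x_signal, x_shortcut]), y

# The shortcut holds 98% of the time in training, but is a coin flip at test time.
X_train, y_train = make_split(5_000, spurious_agreement=0.98)
X_test, y_test = make_split(5_000, spurious_agreement=0.50)

model = LogisticRegression().fit(X_train, y_train)
print("train accuracy:", model.score(X_train, y_train))  # high: the shortcut "works"
print("test accuracy:", model.score(X_test, y_test))     # drops sharply once the shortcut breaks
print("weights [signal, shortcut]:", model.coef_[0])      # most weight sits on the shortcut
```

The coefficient printout makes the failure mode visible: the model concentrates its weight on the shortcut feature rather than the genuine (noisy) signal, which is exactly the route of least effort.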
