Artificial intelligence is an old myth. Most of its core technology dates back to the 1950s, and roughly every 20 years we get excited about it for a few years before moving on to something else. The periods of disinterest in between are called “AI winters”.

Right now it feels more like summer for AI, and one might wonder whether winter is on the horizon.

In our daily life, we mainly encounter two branches of artificial intelligence (AI). The first is Machine Learning: an application of AI that gives systems the ability to learn and improve from experience automatically, without being explicitly programmed.

Imagine that each day we feed such a system a set of data, say the previous day’s temperature, humidity, and wind. What we are building is essentially a weather forecasting model.

After we have provided a large number of examples to the system (the learning phase), there comes a point where, given only today’s data, it can estimate tomorrow’s weather with acceptable reliability.
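To make the two phases concrete, here is a minimal sketch in Python with scikit-learn. Everything in it is an assumption for illustration (the synthetic readings, the rain/no-rain label, the choice of model); it is not drawn from any real forecasting system.

```python
# Minimal sketch of the weather example above. The data, the
# "rain tomorrow" label, and the model are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# One row per day: [temperature (°C), humidity (%), wind (km/h)].
X = np.column_stack([
    rng.normal(15, 8, 1000),     # temperature
    rng.uniform(20, 100, 1000),  # humidity
    rng.uniform(0, 60, 1000),    # wind
])
# Toy rule standing in for reality: humid, windy days tend to
# precede rain. The model never sees this rule, only examples.
y = ((X[:, 1] > 70) & (X[:, 2] > 20)).astype(int)

# Learning phase: show the system many past (features -> outcome) examples.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Prediction phase: given only today's readings, estimate tomorrow.
today = [[12.0, 85.0, 30.0]]  # temperature, humidity, wind
print("rain tomorrow?", bool(model.predict(today)[0]))
print("held-out accuracy:", model.score(X_test, y_test))
```

The point is the shape of the workflow: many past examples go in during `fit`, and a single day’s readings come out as a prediction.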

This classification approach is extremely effective at recognizing cats in photos, anticipating a mechanical breakdown in a car, or determining whether a product coming off the production line is compliant or not.

We can reasonably expect it to soon become just as effective at anticipating diseases.

The principle of “machine learning” is therefore to use habit, the past, to anticipate the future. Such a system is by nature **conformist** and totally incapable of predicting an event that is not in continuity with those that have already happened. In other words, it is no better than your favorite fortune teller.
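A toy sketch makes the point, again with purely invented numbers: a model fitted on a smooth historical trend will confidently continue that trend, and no break with the past can come out of it.

```python
# Illustration of the "conformist" nature described above: a model
# trained only on history can only extrapolate that history.
# All numbers here are invented for the sake of the example.
import numpy as np
from sklearn.linear_model import LinearRegression

# Ten years of a quantity growing steadily, +5 per year.
years = np.arange(2010, 2020).reshape(-1, 1)
values = 100 + 5 * (years.ravel() - 2010)

model = LinearRegression().fit(years, values)

# The model dutifully continues the trend into 2021...
print(model.predict([[2021]]))  # ~155.0, pure continuity

# ...but if 2021 actually brings an unprecedented shock (say the
# true value collapses to 20), nothing in the training data could
# have hinted at it. The prediction is a projection of the past,
# not a vision of the future.
```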

