Every year, automakers add more advanced driver-assistance systems (ADAS) to their fleets. These include adaptive cruise control (ACC), forward collision warning (FCW), automatic parking, and more. One study found that ADAS could prevent up to 28% of all crashes in the United States. This technology will only improve, eventually maturing into Level 5, fully autonomous vehicles.

For a car to completely drive itself, it needs to be able to understand its environment. This includes other vehicles, pedestrians, and road signs.

Road signs give us important information about the law, warn us about dangerous conditions, and guide us to our desired destination. If a car cannot distinguish differences in symbols, colours, and shapes, many people could be seriously injured.

The way a car sees the road is different from how we perceive it. We can all tell the difference between road signs and various traffic situations instantly. When images are fed into a computer, however, it sees only numbers. That means we need to teach the car to learn like humans, or at least to identify signs the way we do.
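To make this concrete, here is a minimal sketch (not from the original article) of what a computer actually receives: a grayscale image is just a grid of pixel intensities, and a color photo is the same grid with three values per pixel.

```python
import numpy as np

# A tiny 3x3 grayscale "image": each entry is a pixel intensity
# from 0 (black) to 255 (white). A real road-sign photo is simply
# a much larger grid of such numbers, with three channels (RGB)
# per pixel for color.
image = np.array([
    [  0, 128, 255],
    [ 64, 192,  32],
    [255,   0, 128],
], dtype=np.uint8)

print(image.shape)  # (3, 3)
print(image.dtype)  # uint8
```

A classifier never sees a "stop sign"; it sees only this array of numbers, which is why it has to learn which numeric patterns correspond to which sign.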

To solve this problem, I tried building my own convolutional neural network (CNN) to classify traffic signs. This process has three main steps: preprocessing images, building the convolutional neural network, and outputting a prediction.
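The three steps above can be sketched end to end in plain NumPy. This is not the article's actual model, just an illustrative toy with random (untrained) weights; the 32x32 input size and 43 output classes are assumptions borrowed from the common GTSRB traffic-sign benchmark.

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1 - preprocess: scale raw 0-255 pixel values to [0, 1].
# A stand-in 32x32 grayscale sign image (random, for illustration).
raw = rng.integers(0, 256, size=(32, 32)).astype(np.float32)
x = raw / 255.0

# Step 2 - one convolutional layer: slide a 3x3 filter across the
# image and apply a ReLU nonlinearity. A real CNN stacks many such
# layers with learned filters; this filter is random.
kernel = rng.standard_normal((3, 3)).astype(np.float32)

def conv2d_relu(img, k):
    kh, kw = k.shape
    oh, ow = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.empty((oh, ow), dtype=np.float32)
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return np.maximum(out, 0.0)  # ReLU

features = conv2d_relu(x, kernel)  # shape (30, 30)

# Step 3 - output a prediction: flatten the feature map, apply a
# (random, untrained) fully connected layer, then softmax to get
# one probability per sign class.
weights = rng.standard_normal((features.size, 43)).astype(np.float32) * 0.01
logits = features.reshape(-1) @ weights
probs = np.exp(logits - logits.max())
probs /= probs.sum()

print(probs.argmax())  # index of the most likely class
```

Training would adjust the filter and the fully connected weights so that the highest probability lands on the correct sign class; the structure of the forward pass, though, stays exactly as above.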


Road Sign Classification: Learning to Build a CNN