An easy-to-follow journey through mainstream CNN variations and novelties

Convolutional Neural Networks: The building blocks

Convolutional Neural Networks, or just CNNs, are a commonly used, shift-invariant method of extracting learnable features from data. CNNs have played a major role in the development and popularity of deep learning and neural networks. I have a separate blog post that covers various types of convolutional kernels and their advantages.
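To make the shift-invariance claim a little more concrete: the convolution operation itself is shift-equivariant, meaning a shifted input produces a correspondingly shifted feature map, and pooling layers then build invariance on top of that. A minimal NumPy sketch of this property (the signal and kernel values are arbitrary, chosen only for illustration):

```python
import numpy as np

def conv1d(x, k):
    # Naive 'valid' 1-D convolution (really cross-correlation, as CNNs use).
    n = len(x) - len(k) + 1
    return np.array([np.dot(x[i:i + len(k)], k) for i in range(n)])

x = np.array([0., 0., 1., 2., 1., 0., 0., 0.])   # a small bump in a signal
k = np.array([1., 0., -1.])                      # an edge-like kernel
y = conv1d(x, k)

# Shifting the input by one step shifts the feature map by one step:
y_shift = conv1d(np.roll(x, 1), k)
assert np.allclose(np.roll(y, 1), y_shift)
```

The same kernel responds to the bump wherever it appears, which is exactly why convolutional features generalize across positions in an image.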

Types of Convolution Kernels: Simplified, an intuitive introduction to different variations of the glamorous CNN layer (towardsdatascience.com)

However, instead of focusing on individual kernels, here I will focus on complete CNN architectures. We will not be able to visit every major development in the history of CNNs individually, but I will try to take you on the journey of how general CNN architectures have evolved over time. You will need a basic understanding of what CNNs are to follow along.


Convolutional Neural Networks: An Overview [Source]

LeNet: Where it all started

LeNet was the first CNN architecture to apply back-propagation to a practical task, and suddenly deep learning was not just a theory anymore. LeNet was used for handwritten digit recognition and outperformed all other existing methods with ease. The architecture was quite simple, with just 5 layers built from 5×5 convolutions and 2×2 pooling, but it paved the way for better and more complex models.


LeNet architecture [1]
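As a rough sketch of the arithmetic behind this architecture, the spatial size of the feature maps can be traced layer by layer. The stack below follows the classic LeNet-style setup of 5×5 convolutions and 2×2 pooling on a 32×32 input; the helper function is my own illustration, not code from the paper:

```python
def out_size(size, kernel, stride=1, padding=0):
    # Spatial size of the output of a 'valid' convolution or pooling layer.
    return (size + 2 * padding - kernel) // stride + 1

# LeNet-style stack on a 32x32 grayscale digit image:
# conv 5x5 -> pool 2x2 -> conv 5x5 -> pool 2x2
size = 32
for kernel, stride in [(5, 1), (2, 2), (5, 1), (2, 2)]:
    size = out_size(size, kernel, stride)

print(size)  # the final feature maps feed the fully connected layers
```

Each convolution shaves off `kernel - 1` pixels and each pooling layer halves the resolution, so the network steadily trades spatial detail for feature depth.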

AlexNet: Deeper is better

AlexNet was one of the first CNN models implemented on GPUs, and it truly connected the growing computation power of the time with deep learning. Its authors created a deeper and more complex CNN with kernels of various sizes (such as 11×11, 5×5, and 3×3) and significantly more channels than LeNet. They also started using ReLU activations instead of sigmoid or tanh, which helped train better models. AlexNet not only won the ImageNet classification challenge in 2012, it beat the runner-up by a margin that suddenly made non-neural models look almost obsolete.
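The mix of kernel sizes and strides can be traced the same way as for LeNet. The layer list below follows the commonly cited AlexNet configuration on a 227×227 input; treat it as an illustrative sketch of the spatial-size arithmetic rather than an exact reproduction of the paper:

```python
def out_size(size, kernel, stride=1, padding=0):
    # Spatial size of the output of a convolution or pooling layer.
    return (size + 2 * padding - kernel) // stride + 1

# (name, kernel, stride, padding) for the AlexNet-style feature extractor:
layers = [
    ("conv 11x11 /4", 11, 4, 0),
    ("pool 3x3 /2",    3, 2, 0),
    ("conv 5x5",       5, 1, 2),
    ("pool 3x3 /2",    3, 2, 0),
    ("conv 3x3",       3, 1, 1),
    ("conv 3x3",       3, 1, 1),
    ("conv 3x3",       3, 1, 1),
    ("pool 3x3 /2",    3, 2, 0),
]
size = 227
for name, k, s, p in layers:
    size = out_size(size, k, s, p)
    print(f"{name}: {size}x{size}")
```

The large strided 11×11 kernel shrinks the input aggressively up front, while the padded 3×3 layers deeper in the stack preserve resolution and let the network grow deeper cheaply, a pattern later architectures pushed much further.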


From LeNet to EfficientNet: The evolution of CNNs