In computer vision we often deal with tasks such as image classification, segmentation, and object detection. When building a deep learning model for image classification over a very large database of images, we use transfer learning to save training time and improve model performance. Transfer learning reuses the weights of a pre-trained model such as VGG16, ResNet50, or Inception that was trained on the ImageNet dataset.

Also, when we train an image classification model we want a model that does not overfit. Overfitted models perform well on the training data but poorly when predictions are computed on the test data. This is why we use regularization techniques such as Dropout and Batch Normalization to avoid overfitting.

Through this article, we will explore the use of dropout with a pre-trained ResNet model. We will build two different models, one without dropout and one with it, and finally compare the two with graphs and performance metrics. For this experiment we will use the CIFAR10 dataset, which is available in Keras and can also be found on Kaggle.
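As a quick sketch (assuming a standard TensorFlow/Keras installation), CIFAR10 can be loaded directly from Keras like so:

```python
# Sketch: loading CIFAR10 from Keras (the data is downloaded on first use).
from tensorflow.keras.datasets import cifar10
from tensorflow.keras.utils import to_categorical

(x_train, y_train), (x_test, y_test) = cifar10.load_data()

x_train = x_train.astype("float32") / 255.0  # scale pixel values to [0, 1]
x_test = x_test.astype("float32") / 255.0

y_train = to_categorical(y_train, 10)  # one-hot encode the 10 classes
y_test = to_categorical(y_test, 10)
```

The training set has 50,000 32x32 RGB images and the test set has 10,000.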


What we will learn from this article?

  • What is ResNet 50 Architecture?
  • How to use ResNet 50 for Transfer Learning?
  • How to build a model with and without dropout using ResNet?
  • Comparison of both the built models

What is ResNet50 Architecture?

ResNet was a model built for the ImageNet competition. It was the first practical network of this depth, with variants of more than 100 layers (ResNet-152 has 152). It reduced the top-5 error rate to 3.57%, compared with roughly 7.3% for VGG. The main idea behind this network is the residual connection: instead of learning the original mapping directly, each block learns a residual function that is added back to its input. You can read more about the ResNet architecture in the original paper and in the Keras documentation.
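To make the residual idea concrete, here is a minimal sketch of an identity residual block in Keras. It is illustrative only, not the exact bottleneck block used inside ResNet50:

```python
# Minimal sketch of a residual (identity) block: the output is F(x) + x,
# so the convolutions only have to learn the residual F(x).
import tensorflow as tf
from tensorflow.keras import layers

def residual_block(x, filters):
    shortcut = x
    # Main path: two conv layers learning the residual function F(x)
    y = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    y = layers.Conv2D(filters, 3, padding="same")(y)
    # Skip connection: add the input back before the final activation
    y = layers.Add()([y, shortcut])
    return layers.Activation("relu")(y)

inputs = tf.keras.Input(shape=(32, 32, 64))
outputs = residual_block(inputs, 64)
block = tf.keras.Model(inputs, outputs)
```

Because the skip path is an identity, the spatial shape and channel count are unchanged, which is what lets these blocks be stacked very deep.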


Dropout

Dropout is a regularization technique that reduces overfitting in neural networks by preventing complex co-adaptations on the training data; it is also an efficient way of performing model averaging. The technique is sometimes called dilution, because it thins out the weights. The term dropout refers to randomly "dropping out", or omitting, units (both hidden and visible) during the training process of a network.
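A small sketch of this behavior with the Keras `Dropout` layer: during training a fraction of activations is zeroed (and the surviving ones are scaled up by 1/(1-rate) to preserve the expected sum), while at inference time the layer is an identity.

```python
# Sketch: Dropout zeroes random units during training, does nothing at inference.
import tensorflow as tf

layer = tf.keras.layers.Dropout(0.5, seed=0)
x = tf.ones((1, 10))

train_out = layer(x, training=True)   # some units zeroed, survivors scaled to 2.0
infer_out = layer(x, training=False)  # identity: returns x unchanged
```

This train/inference asymmetry is why Keras only applies dropout when `training=True` (e.g. inside `model.fit`).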

Model Without Dropout

Now we will build the image classification model using ResNet without making use of dropouts. First, we will define all the required libraries and packages. Use the below code to import the same.

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import applications
from tensorflow.keras.utils import to_categorical
from tensorflow.keras.models import Sequential, Model
from tensorflow.keras.layers import Conv2D, MaxPooling2D
from tensorflow.keras.layers import Dense, Dropout, Flatten
from tensorflow.keras.applications.resnet50 import ResNet50
from tensorflow.keras.applications.resnet50 import preprocess_input
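With the imports in place, the "without dropout" model can be sketched as ResNet50 used as a frozen feature extractor with a small classification head for CIFAR10's 10 classes. The head sizes below (a single Dense(256) layer) are an illustrative assumption, not the article's exact configuration; pass `weights="imagenet"` instead of `weights=None` to actually use the pre-trained ImageNet weights (kept offline-friendly here).

```python
# Hypothetical sketch: ResNet50 as a frozen base plus a dense head, no dropout.
import tensorflow as tf
from tensorflow.keras.applications.resnet50 import ResNet50

# weights=None keeps this sketch offline; use weights="imagenet" for transfer learning
base = ResNet50(weights=None, include_top=False, input_shape=(32, 32, 3))
base.trainable = False  # freeze the base so only the head is trained

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),  # 10 CIFAR10 classes
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

The dropout variant would differ only by inserting a `Dropout` layer between the dense layers of the head.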
