This video explains why the Rectified Linear Unit (ReLU) is required in a CNN, i.e. it covers the importance of the ReLU layer in a CNN. This layer has become very popular over the years as a standard component of CNN models. With the help of a visual example, we are going to see how non-linearity is introduced into an image and what the result looks like when a ReLU layer is applied on top of a convolution layer.
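As a rough companion to the visual example, here is a minimal NumPy sketch (the toy image and kernel values are illustrative assumptions, not taken from the video): a convolution produces both positive and negative responses, and ReLU zeroes out the negative ones, which is exactly the non-linearity applied on top of the convolution layer.

```python
import numpy as np

# Toy grayscale "image" with a dark-bright-dark pattern (illustrative values).
image = np.array([
    [0, 255, 255, 0],
    [0, 255, 255, 0],
    [0, 255, 255, 0],
    [0, 255, 255, 0],
], dtype=float)

# Simple vertical edge-detection kernel (illustrative).
kernel = np.array([
    [1, 0, -1],
    [1, 0, -1],
    [1, 0, -1],
], dtype=float)

def conv2d_valid(img, k):
    """Naive sliding-window convolution as used in CNN layers
    (cross-correlation, valid padding, stride 1)."""
    kh, kw = k.shape
    oh, ow = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

feature_map = conv2d_valid(image, kernel)

# ReLU: keep positive responses, set negative ones to zero.
relu_map = np.maximum(feature_map, 0)

print("Feature map after convolution:\n", feature_map)
print("Feature map after ReLU:\n", relu_map)
```

Running this prints a feature map containing both negative and positive edge responses; after ReLU the negative entries become zero, which is the effect the video shows visually on a real image.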

#data-science #machine-learning #programming #developer

Why Rectified Linear Unit (ReLU) is required in CNN? | ReLU Layer in CNN