Learn all the basics you need to get started with this deep learning framework! In this part, we learn about activation functions in neural networks: what activation functions are, why they are needed, and how to apply them in PyTorch.

I go over the following activation functions:

  • Binary Step
  • Sigmoid
  • TanH (Hyperbolic Tangent)
  • ReLU
  • Leaky ReLU
  • Softmax
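Most of the functions above can be applied in PyTorch either as modules from `torch.nn` (handy inside `nn.Sequential`) or as functions from `torch` / `torch.nn.functional`. A minimal sketch of both styles (the tensor values are just illustrative; Binary Step has no built-in PyTorch module, so it is written directly):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.tensor([-1.0, 0.0, 1.0, 2.0])

# Option 1: module API (can be used as layers in a model)
sigmoid = nn.Sigmoid()
tanh = nn.Tanh()
relu = nn.ReLU()
leaky_relu = nn.LeakyReLU(negative_slope=0.01)
softmax = nn.Softmax(dim=0)

print(relu(x))  # tensor([0., 0., 1., 2.])

# Option 2: functional API (applied directly in forward())
out = torch.sigmoid(x)
out = torch.tanh(x)
out = F.relu(x)
out = F.leaky_relu(x, negative_slope=0.01)
out = F.softmax(x, dim=0)

# Binary Step: no built-in module, but easy to express directly
step = (x > 0).float()  # tensor([0., 0., 1., 1.])
```

Which style you choose is mostly a matter of taste; the module API makes the activation visible in the model definition, while the functional API keeps `forward()` compact.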

Official website:

Code for this tutorial series:

#python #pytorch #deep-learning #programming #developer

PyTorch Tutorial - Activation Functions