The activation function defines the output of a neuron (node) given an input or a set of inputs (the outputs of other neurons). It mimics the stimulation of a biological neuron.

Passing the output of the activation function to the next layer (in a shallow neural network: from the input layer to the output layer; in a deep network: to the next hidden layer) is called forward propagation (information propagation). The activation function is what makes this transformation of the neural network non-linear.
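
As a minimal sketch of one forward-propagation step (using NumPy and a sigmoid activation as illustrative choices, with toy weights that are not from the original notebook), a layer applies an affine transform followed by the activation:

import numpy as np

def sigmoid(x):
    # Illustrative activation: squashes the pre-activation into (0, 1)
    return 1 / (1 + np.exp(-x))

def forward(x, W, b):
    # One layer of forward propagation:
    # affine transform (W @ x + b), then a non-linear activation
    return sigmoid(W @ x + b)

# Toy example: 3 inputs feeding a layer of 2 neurons
x = np.array([0.5, -1.0, 2.0])
W = np.array([[0.2, -0.5, 1.0],
              [0.7, 0.3, -0.8]])
b = np.array([0.1, -0.1])
print(forward(x, W, b))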

A notebook with all the code is available here: GitHub

Here is a list of commonly used activation functions:

  • Binary
  • Linear
  • Sigmoid
  • Tanh
  • ReLU
  • Leaky ReLU (LReLU)
  • Parametric ReLU (PReLU)
  • Exponential Linear Unit (eLU)
  • ReLU-6
  • Softplus
  • Softsign
  • Softmax
  • Swish

Binary

The binary activation function is the simplest. It acts as a binary classifier: the output is 0 if the value is negative, else 1. Think of this activation function as a threshold in binary classification.

The code for a binary activation function is:

def binary_active_function(x):
    # Threshold at zero: negative inputs map to 0, all others to 1
    return 0 if x < 0 else 1

What is the output of this function?

for i in [-5, -3, -1, 0, 2, 5]:
    print(binary_active_function(i))

output:
    0
    0
    0
    1
    1
    1
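
For array inputs, a vectorized variant is often more convenient. A possible NumPy version (an assumption; the original notebook may implement it differently):

import numpy as np

def binary_active_function(x):
    # Element-wise threshold: negative values map to 0, all others to 1
    return np.where(np.asarray(x) < 0, 0, 1)

print(binary_active_function([-5, -3, -1, 0, 2, 5]))
# output: [0 0 0 1 1 1]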

#machine-learning #neural-networks #deep-learning #activation-functions #data-science
