In this video we build on last week's multilayer perceptron to allow for more flexibility in the architecture!

Making a flexible neural network architecture API isn't too difficult. However, we need to be careful about the layers of abstraction we put in place in order to facilitate the work of users who simply want to fit and predict. Here we make use of the following three concepts: Network, Layer and Neuron. These three components are composed together to make a fully connected feedforward neural network, as sketched below.
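To make the composition concrete, here is a minimal sketch of how the three concepts could fit together. The class and method names, the random initialization, and the sigmoid activation are assumptions for illustration, not the exact code from the linked notebook.

import math
import random

class Neuron:
    def __init__(self, num_inputs):
        # One weight per incoming connection plus a bias term.
        self.weights = [random.uniform(-1, 1) for _ in range(num_inputs)]
        self.bias = random.uniform(-1, 1)

    def activate(self, inputs):
        # Weighted sum followed by a sigmoid non-linearity.
        total = sum(w * x for w, x in zip(self.weights, inputs)) + self.bias
        return 1.0 / (1.0 + math.exp(-total))

class Layer:
    def __init__(self, num_neurons, num_inputs):
        # Fully connected: every neuron sees every output of the previous layer.
        self.neurons = [Neuron(num_inputs) for _ in range(num_neurons)]

    def forward(self, inputs):
        return [neuron.activate(inputs) for neuron in self.neurons]

class Network:
    def __init__(self, layer_sizes):
        # layer_sizes = [n_inputs, n_hidden_1, ..., n_outputs]
        self.layers = [Layer(n, prev) for prev, n in zip(layer_sizes, layer_sizes[1:])]

    def forward(self, inputs):
        # Information flows forward through each layer in turn, with no cycles.
        for layer in self.layers:
            inputs = layer.forward(inputs)
        return inputs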

For those who don't know, a fully connected feedforward neural network is defined as follows (from Wikipedia):
“A feedforward neural network is an artificial neural network wherein connections between the nodes do not form a cycle. As such, it is different from its descendant: recurrent neural networks.
The feedforward neural network was the first and simplest type of artificial neural network devised. In this network, the information moves in only one direction, forward, from the input nodes, through the hidden nodes (if any) and to the output nodes. There are no cycles or loops in the network.”
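Using the hypothetical sketch above, that one-directional flow looks like this: an input vector enters the first layer and each layer's output feeds the next, with nothing ever flowing backwards.

# 2 inputs, one hidden layer of 3 neurons, 1 output neuron (illustrative sizes).
net = Network([2, 3, 1])
print(net.forward([0.5, -0.2]))  # a single-element list, e.g. [0.63...]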

GitHub: https://github.com/yacineMahdid/artificial-intelligence-and-machine-learning/blob/master/Deep Learning from Scratch in Python/multi_layer_perceptron.ipynb

#python #deep-learning #machine-learning

Deep Neural Network from Scratch in Python