Introduction

The need to reduce a model's complexity can arise from multiple factors, most often the desire to cut its computational requirements. However, complexity cannot be reduced arbitrarily: after many iterations of training and testing, that complex model is the one that produced good results. Research on this topic is active; for example, [Koning et al., 2019] propose a solution to this same problem for CNNs used in exoplanet detection:

Convolutional Neural Networks (CNNs) suffer from having too many trainable parameters, impacting computational performance … We propose and examine two methods for complexity reduction in AstroNet … The first method makes only a tactical reduction of layers in AstroNet while the second method also modifies the original input data by means of a Gaussian pyramid
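
To make the Gaussian-pyramid idea concrete, here is a minimal sketch of one pyramid-reduction step for a 1D input signal, using SciPy's gaussian_filter1d. It illustrates the general technique only, not Koning et al.'s exact pipeline, and the input series is synthetic:

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def pyramid_reduce(signal, sigma=1.0):
    # One Gaussian-pyramid level: low-pass filter first to limit aliasing,
    # then keep every second sample, halving the input length.
    smoothed = gaussian_filter1d(signal, sigma=sigma)
    return smoothed[::2]

# Toy stand-in for an input series (e.g. a light curve); not real data.
x = np.sin(np.linspace(0, 20, 2048)) + 0.1 * np.random.randn(2048)

level_1 = pyramid_reduce(x)        # length 1024
level_2 = pyramid_reduce(level_1)  # length 512
print(len(x), len(level_1), len(level_2))
```

Each level halves the input size, so a network fed a reduced level needs proportionally fewer parameters in its early layers.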

The second method (modifying or transforming the input data) is a common approach. According to Google's Machine Learning Crash Course, transformations are performed primarily for two reasons:

  1. Mandatory transformations: they make the data compatible with the algorithm, e.g. converting non-numeric features into numeric ones.
  2. Quality transformations: they help the model perform better, e.g. normalizing numeric features. Both kinds are illustrated in the sketch after this list.
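
As a concrete illustration of both categories, here is a minimal NumPy sketch; the feature names and values are hypothetical, invented only for this example:

```python
import numpy as np

# Hypothetical toy features; names and values are made up for illustration.
colors = np.array(["red", "green", "blue", "green"])  # non-numeric feature
sizes = np.array([12.0, 340.0, 5.0, 90.0])            # numeric feature

# 1. Mandatory transformation: one-hot encode the categorical feature so the
#    algorithm receives numbers instead of strings.
categories = np.unique(colors)                         # ['blue' 'green' 'red']
one_hot = (colors[:, None] == categories[None, :]).astype(float)

# 2. Quality transformation: z-score normalization so the numeric feature has
#    zero mean and unit variance, which typically helps training converge.
normalized = (sizes - sizes.mean()) / sizes.std()

print(one_hot)
print(normalized)
```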

The kind of transformation proposed by [Koning et al., 2019], and the one proposed in this article, both fall into the second category.
