The field of Machine Learning is huge, and it is easy to be overwhelmed by the amount of information out there. To keep you from getting lost, the following list helps you estimate where you are: it outlines the vast Deep Learning space without emphasizing particular resources. Where appropriate, I have included clues to help you orient yourself.

An excerpt of the list, by the author. The list is available on GitHub here and on Notion here.

Since the list has gotten rather long, I have included an excerpt above; the full list is at the bottom of this post.

Entry level

The entry level is split into 5 categories:

  • Data handling introduces you to small datasets
  • Classic Machine Learning covers key concepts of classic Machine Learning techniques
  • Networks covers the classic DNNs, CNNs, and RNNs
  • Theory lists the concepts behind the above categories
  • General lists the main things you encounter at this stage

Data handling

At the entry level, the datasets used are small and often fit easily into main memory. If they don’t already come pre-processed, applying the common pre-processing operations takes only a few lines of code. You’ll mainly do so for the major domains: Audio, Image, Time-series, and Text.
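To illustrate the "few lines of code" point, here is a minimal sketch of one such pre-processing step, min-max scaling, on a small in-memory dataset. The values are made-up placeholders, not a real dataset:

```python
# A toy in-memory dataset (placeholder values).
data = [2.0, 4.0, 6.0, 10.0]

# Min-max scaling: map the values linearly into the range [0, 1].
lo, hi = min(data), max(data)
scaled = [(x - lo) / (hi - lo) for x in data]
```

In practice you would reach for a library (e.g. scikit-learn's scalers) rather than hand-rolling this, but at this stage the datasets are small enough that either approach works.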

Classic Machine Learning

Before diving into the large field of Deep Learning, it’s a good idea to study the basic techniques. These include regression, clustering, and Support Vector Machines (SVMs). Of the listed algorithms, only the SVM might be a bit trickier. Don’t let yourself be overwhelmed by it: give it a try, and then move on.
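As a taste of the classic techniques, here is a sketch of the simplest one, linear regression, fit with the closed-form least-squares formulas on a toy dataset (the data points are invented for illustration):

```python
# Toy data that lies exactly on the line y = 2x + 1.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]

# Closed-form ordinary least squares for a single feature:
# slope w = cov(x, y) / var(x), intercept b = mean(y) - w * mean(x).
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
w = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
b = mean_y - w * mean_x
```

Clustering and SVMs don't have closed forms this simple, which is part of why the SVM feels trickier: it is usually solved as a constrained optimization problem.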

Networks

No Deep Learning without its most important ingredient: neural networks in all their variants: GANs, AEs, VAEs, Transformers, DNNs, CNNs, RNNs, and many more. But there’s no need to cover everything yet. At this stage, it’s sufficient to look at the last three: DNNs, CNNs, and RNNs.

Theory

No Deep Learning without neural networks, and no neural networks without (mathematical) theory. You can begin by getting to know mathematical notation. It’s a bit scary at first, but you’ll soon come to appreciate its concise brevity. Once you grasp it, look at matrix operations, a central concept behind neural networks.
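To make the matrix-operations point concrete, here is a bare-bones sketch of matrix multiplication, the operation a dense neural-network layer performs on its inputs. Real code would use a library like NumPy; this pure-Python version just spells out the definition:

```python
def matmul(a, b):
    """Multiply matrix a (m x k) by matrix b (k x n), both given as
    lists of rows. Entry (i, j) is the dot product of row i of a
    with column j of b -- exactly what a dense layer computes."""
    k = len(b)       # shared inner dimension
    n = len(b[0])    # columns of the result
    return [
        [sum(a[i][p] * b[p][j] for p in range(k)) for j in range(n)]
        for i in range(len(a))
    ]
```

For example, `matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]])` yields `[[19, 22], [43, 50]]`.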

And with this knowledge, you can then proceed to the convolution operations, another central concept.

Put simply, you move one matrix (the kernel) over another (the input) and, at each position, calculate the inner product of the overlapping areas. There are many variants; keep learning and you’ll naturally use them, too!

#data-science #deep-learning #machine-learning

A guide to the field of Deep Learning