Welcome back to my series _Neural Networks Intuitions_. In this ninth segment, we will look into deep distance metric learning: the motivation behind using it, the wide range of methods proposed, and its applications.

Note: All techniques discussed in this article come under Deep Metric Learning (DML), i.e. distance metric learning using neural networks.


Distance Metric Learning:

Distance metric learning means learning a distance in a low-dimensional space which is consistent with the notion of semantic similarity (as given in [No Fuss Distance Metric Learning using Proxies]).

What does the above statement mean w.r.t. the image domain?

It means learning a distance in a low-dimensional space (a non-input space) such that similar images in the input space map to nearby representations (low distance) and dissimilar images map to distant representations (high distance).
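To make this concrete, here is a minimal sketch of how distances in such an embedding space would be compared. The embedding vectors below are made-up placeholders standing in for the output of some trained encoder; only the distance computation itself is real.

```python
import numpy as np

def euclidean_distance(a, b):
    # Distance between two embedding vectors in the learned space.
    return float(np.linalg.norm(a - b))

# Hypothetical L2-normalized embeddings produced by some encoder f(x).
emb_cat_1 = np.array([0.98, 0.10, 0.17])  # an image of a cat
emb_cat_2 = np.array([0.95, 0.05, 0.31])  # another cat image
emb_dog   = np.array([0.10, 0.99, 0.10])  # an image of a dog

d_similar = euclidean_distance(emb_cat_1, emb_cat_2)     # ~0.15
d_dissimilar = euclidean_distance(emb_cat_1, emb_dog)    # ~1.25
# A well-trained embedding should satisfy: d_similar < d_dissimilar.
```

The whole point of metric learning is to train the encoder so that this inequality holds for semantically similar and dissimilar pairs.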

Okay, this sounds exactly like what a classifier does, doesn't it? Yes.

So how is this different from supervised image classification? Why different terminology?

Metric learning addresses the open-set problem in machine learning, i.e. generalizing to examples from classes unseen during training.

This is not possible with a standard classification network, i.e. a feature extractor followed by a fully connected layer.
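Why does an embedding solve the open-set problem where a fixed FC layer does not? Because once images are mapped into the learned space, recognition reduces to a nearest-neighbor lookup against a gallery of reference embeddings: new classes are handled by simply adding their embeddings to the gallery, with no retraining. A minimal sketch (the embeddings and class names below are invented for illustration):

```python
import numpy as np

def nearest_neighbor_label(query, gallery_embs, gallery_labels):
    # Assign the label of the closest gallery embedding.
    # Adding a brand-new class only requires adding its embedding
    # to the gallery -- no change to the network is needed.
    dists = np.linalg.norm(gallery_embs - query, axis=1)
    return gallery_labels[int(np.argmin(dists))]

# Hypothetical embeddings for classes never seen during training.
gallery_embs = np.array([[1.0, 0.0],    # reference image of class "zebra"
                         [0.0, 1.0]])   # reference image of class "okapi"
gallery_labels = ["zebra", "okapi"]

query = np.array([0.9, 0.1])  # embedding of a new zebra image
predicted = nearest_neighbor_label(query, gallery_embs, gallery_labels)
```

A classifier's FC layer, by contrast, has one fixed output unit per training class, so it cannot score a class it has never seen.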

Why?

This is a very important question. The answer is as follows:

  1. A classifier learns **class-specific features, not necessarily generic features.**
  2. A classifier trained with the standard cross-entropy loss maximizes inter-class distances only to the point where the features before the FC layer are linearly separable.
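Metric learning losses, by contrast, act on distances in the embedding space directly. As a preview of the methods covered later, here is a sketch of the classic triplet loss, which pulls an anchor towards a positive (same class) and pushes it away from a negative (different class) by at least a margin; the vectors and margin value below are illustrative placeholders.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    # L = max(0, d(a, p) - d(a, n) + margin)
    d_ap = np.linalg.norm(anchor - positive)  # anchor-positive distance
    d_an = np.linalg.norm(anchor - negative)  # anchor-negative distance
    return float(max(0.0, d_ap - d_an + margin))

a = np.array([1.0, 0.0])   # anchor embedding
p = np.array([0.9, 0.1])   # same class as the anchor
n = np.array([0.0, 1.0])   # different class

easy = triplet_loss(a, p, n)  # negative is already far: loss is 0
hard = triplet_loss(a, n, p)  # "positive" is far away: loss is positive
```

Unlike cross entropy, this objective keeps shaping the geometry of the embedding space beyond the point of linear separability, which is what lets the features transfer to unseen classes.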

Neural Networks Intuitions: 9. Distance Metric Learning