On a lighter note, the embedding of a particular word (in a higher dimension) is nothing but a vector representation of that word (in a lower dimension). Words with similar meanings, e.g. “joyful” and “cheerful”, and other closely related words, e.g. “money” and “bank”, get closer vector representations when projected into the lower dimension.
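To make “closer vector representation” concrete, here is a minimal sketch using cosine similarity. The 2-D vectors below are made-up illustrative values, not trained embeddings; the point is only that similar words end up pointing in nearly the same direction.

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical 2-D embeddings: "joyful" and "cheerful" point in nearly
# the same direction, while "bank" points somewhere else entirely.
embeddings = {
    "joyful":   [0.90, 0.80],
    "cheerful": [0.85, 0.75],
    "bank":     [-0.70, 0.20],
}

print(cosine_similarity(embeddings["joyful"], embeddings["cheerful"]))  # close to 1.0
print(cosine_similarity(embeddings["joyful"], embeddings["bank"]))      # much lower
```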
The transformation from words to vectors is called word embedding.
So the underlying concept in creating a mini word embedding boils down to training a simple auto-encoder on some text data.
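As a rough sketch of that bottleneck idea, the snippet below trains a tiny auto-encoder in NumPy on a made-up four-word vocabulary: one-hot inputs are squeezed through a 2-unit hidden layer and reconstructed with a softmax. The vocabulary, learning rate, and step count are all illustrative assumptions, not the post's actual setup; the encoder weights play the role of the embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = ["joyful", "cheerful", "money", "bank"]   # toy vocabulary (assumption)
V, D = len(vocab), 2                              # vocab size, embedding size

X = np.eye(V)                                     # one-hot vector per word

W_enc = rng.normal(0, 0.1, (V, D))                # encoder weights = embeddings
W_dec = rng.normal(0, 0.1, (D, V))                # decoder weights

def forward(W_enc, W_dec):
    hidden = X @ W_enc                            # project to 2-D bottleneck
    logits = hidden @ W_dec                       # project back to vocab size
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    return hidden, probs

lr = 0.5
loss_before = -np.log(forward(W_enc, W_dec)[1][np.arange(V), np.arange(V)]).mean()

for _ in range(500):
    hidden, probs = forward(W_enc, W_dec)
    grad_logits = (probs - X) / V                 # softmax cross-entropy gradient
    grad_dec = hidden.T @ grad_logits
    grad_enc = X.T @ (grad_logits @ W_dec.T)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

loss_after = -np.log(forward(W_enc, W_dec)[1][np.arange(V), np.arange(V)]).mean()
print(loss_before, loss_after)                    # loss should drop

embeddings = X @ W_enc                            # each row: a word's 2-D embedding
```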
Before we proceed to creating our mini word embedding, it’s good to brush up on the basic word embedding concepts the deep learning community has developed so far.
The popular, state-of-the-art word embedding models out there are as follows:
They are trained on a huge text corpus, such as Wikipedia or scraped web text of up to 6 billion words (the higher dimension), and project the words into dense embeddings of as few as 100, 200, or 300 dimensions (the lower dimension).
Here, in our model, we project them into a 2-dimensional dense embedding.
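Dimension-wise, an embedding table is just a matrix of shape (vocabulary size, embedding size), and looking a word up is indexing a row by the word’s id. The numbers below are hypothetical placeholders, not from any trained model:

```python
import numpy as np

# Hypothetical sizes: a 10,000-word vocabulary projected down to 2 dimensions.
vocab_size, embed_dim = 10_000, 2
embedding_matrix = np.random.default_rng(1).normal(size=(vocab_size, embed_dim))

# Looking up a word's embedding is just indexing a row by the word's id.
word_id = 42
vector = embedding_matrix[word_id]
print(vector.shape)  # (2,)
```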
The above state-of-the-art models use one of two primary techniques to accomplish the task.
CBOW attempts to predict the output (target word) from its neighboring words (context words). The window size is a hyper-parameter here.
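To see what the window size controls, here is a small sketch (the helper name `cbow_pairs` and the sample sentence are my own) that turns a token list into CBOW-style (context, target) training pairs:

```python
def cbow_pairs(tokens, window=2):
    """Generate (context_words, target_word) pairs for CBOW training.

    For each position, the context is up to `window` tokens on each side.
    """
    pairs = []
    for i, target in enumerate(tokens):
        context = tokens[max(0, i - window):i] + tokens[i + 1:i + 1 + window]
        pairs.append((context, target))
    return pairs

sentence = "the quick brown fox jumps".split()
for context, target in cbow_pairs(sentence, window=2):
    print(context, "->", target)
# e.g. ['the', 'quick', 'fox', 'jumps'] -> brown
```

A larger window gives each target more context words, at the cost of pulling in less-related neighbors.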