Word embedding algorithms learn fixed-length, dense, continuous-valued vectors for words from a large text corpus. Each word corresponds to a point in a vector space, and training moves these points so that words appearing in similar contexts end up close together, preserving semantic relationships.
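As a minimal sketch of this idea (not GloVe itself, which optimizes a weighted log-co-occurrence objective), the snippet below builds a word-by-word co-occurrence matrix from a tiny hypothetical corpus and factorizes it with SVD to get fixed-length dense vectors; words used in similar contexts come out with similar vectors. The corpus, window size, and dimensionality are illustrative choices, not values from the article.

```python
import numpy as np

# Toy corpus (hypothetical, for illustration only).
corpus = [
    "the king rules the kingdom",
    "the queen rules the kingdom",
    "the cat sat on the mat",
    "the dog sat on the mat",
]

# Build the vocabulary and a word -> index map.
tokens = [s.split() for s in corpus]
vocab = sorted({w for sent in tokens for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Count symmetric co-occurrences within a +/-2-word window.
window = 2
C = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                C[idx[w], idx[sent[j]]] += 1.0

# Truncated SVD yields fixed-length dense vectors (here, 3 dimensions).
U, S, _ = np.linalg.svd(C)
dim = 3
vectors = U[:, :dim] * S[:dim]

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "king" and "queen" share contexts, so their points lie close together;
# "king" and "mat" do not, so their similarity is lower.
print(cosine(vectors[idx["king"]], vectors[idx["queen"]]))
print(cosine(vectors[idx["king"]], vectors[idx["mat"]]))
```

In practice, GloVe and word2vec learn such vectors by optimization over a full corpus rather than by a direct matrix factorization, but the geometric intuition is the same: context similarity becomes spatial proximity.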
Read more: https://analyticsindiamag.com/hands-on-guide-to-word-embeddings-using-glove/