Word Embeddings

Word embeddings represent words as vectors in a continuous, high-dimensional vector space. Here, we look at how such vectors can be trained and what kind of information they encode.
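
As a minimal sketch of the idea (using toy, hand-picked vectors rather than trained embeddings), the snippet below shows how each word maps to a vector and how geometric closeness, here measured by cosine similarity, is meant to reflect semantic relatedness. The specific words, dimensions, and values are illustrative assumptions, not trained results.

```python
import numpy as np

# Toy 4-dimensional "embeddings" chosen by hand for illustration only;
# real embeddings are learned from data and typically have 100-300 dimensions.
embeddings = {
    "king":  np.array([0.8, 0.7, 0.1, 0.0]),
    "queen": np.array([0.8, 0.7, 0.9, 0.1]),
    "apple": np.array([0.1, 0.0, 0.2, 0.9]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Related words should end up closer together than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # relatively high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # relatively low
```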

Reading Material