Word Embeddings
Word embeddings represent words as vectors in a high-dimensional vector space. Here, we look at how such vectors can be trained and what kind of information they encode.
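To make this concrete, the short sketch below inspects a set of pre-trained GloVe vectors via the gensim library (an assumption of this example, not part of the reading): nearby vectors correspond to semantically related words, and simple vector arithmetic can capture analogies such as king - man + woman ≈ queen.

```python
# A minimal sketch, assuming gensim is installed and the vectors can be downloaded.
import gensim.downloader as api

# Load 50-dimensional GloVe vectors trained on Wikipedia + Gigaword.
vectors = api.load("glove-wiki-gigaword-50")

# Each word maps to a dense vector; semantically similar words end up close together.
print(vectors["king"].shape)                 # (50,)
print(vectors.most_similar("king", topn=3))  # nearest neighbours by cosine similarity

# The classic analogy test: king - man + woman ≈ queen
print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=1))
```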
Reading Material
- Chapter 6 of Jurafsky & Martin (2024), Speech and Language Processing (3rd ed. draft)
This page is still under construction for the upcoming Spring 2025 course session.