Word Embeddings
Word embeddings represent words as dense vectors in a high-dimensional vector space. In this lecture, we look at how such vectors can be trained and what kind of information they encode; a small sketch follows below.
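To make the idea concrete, the sketch below trains skip-gram embeddings on a toy corpus and then queries the resulting vectors for similarity. The gensim library, the toy sentences, and all hyperparameters are illustrative assumptions, not part of the lecture materials.

```python
# Minimal sketch: training word embeddings with gensim's Word2Vec (skip-gram).
# The corpus and hyperparameters are toy values chosen only for illustration.
from gensim.models import Word2Vec

toy_corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["a", "cat", "chased", "a", "mouse"],
    ["a", "dog", "chased", "a", "ball"],
]

# Train skip-gram embeddings (sg=1); vector_size sets the embedding dimensionality.
model = Word2Vec(
    sentences=toy_corpus,
    vector_size=50,
    window=2,
    min_count=1,
    sg=1,
    epochs=200,
    seed=0,
)

# Each word is now a dense vector of length 50.
print(model.wv["cat"].shape)

# Words that occur in similar contexts end up close together in the space.
print(model.wv.similarity("cat", "dog"))
print(model.wv.most_similar("cat", topn=3))
```

On this tiny corpus the neighbours are noisy, but with larger corpora the same procedure yields vectors whose distances reflect semantic and syntactic relatedness, which is the kind of information the lecture examines.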
Lecture Slides
Please note that the exact content of the slides may still change before the lecture.
Reading Material
- Chapter 6 of Jurafsky & Martin, Speech and Language Processing (3rd ed. draft, 2024)