Language modelling
Language modelling is the task of predicting which word comes next in a sequence of words – a seemingly simple problem that nevertheless serves as a cornerstone for generating and understanding human language with computers. In this unit, you will learn about two types of language models: \(n\)-gram models and neural models, with a focus on models based on recurrent neural networks.
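To make the prediction task concrete, here is a minimal sketch of a bigram (\(n = 2\)) language model estimated from counts. The toy corpus and all names in the code are illustrative assumptions, not part of the course material:

```python
from collections import Counter, defaultdict

# Toy corpus (made up for illustration).
corpus = [
    "the cat sat on the mat",
    "the cat ate the fish",
]

# Count how often each word follows each context word.
bigram_counts = defaultdict(Counter)
for sentence in corpus:
    tokens = ["<s>"] + sentence.split() + ["</s>"]
    for prev, word in zip(tokens, tokens[1:]):
        bigram_counts[prev][word] += 1

def next_word_probs(prev):
    """Maximum-likelihood estimate of P(word | prev)."""
    counts = bigram_counts[prev]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(next_word_probs("the"))
# → {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}
```

Neural language models covered later in the unit replace these raw counts with learned continuous representations, which lets them generalise to word sequences never seen in training.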
Video lectures
| Section | Title |
|---|---|
| 1.01 | Introduction to language modelling |
| 1.02 | N-gram language models |
| 1.03 | Neural language models |
| 1.04 | Recurrent neural networks (RNNs) |
| 1.05 | The LSTM architecture |
| 1.06 | RNN language models |
Reading
- Eisenstein (2019), chapter 6
- Christopher Olah’s blog article *Understanding LSTM Networks*