# Structured prediction
Structured prediction is an umbrella term for tasks that involve predicting structured outputs, such as sequences or graphs. In this unit, you will learn about two approaches to such tasks: local search, where the problem is decomposed into a sequence of classification decisions, each made greedily; and global search, where structured prediction is cast as a combinatorial optimisation problem over the space of complete output structures.
## Lectures
The first part of this unit is about sequence labelling, the task of labelling each element of an input sequence with a “tag”, such as its part-of-speech. We specifically cover the Viterbi algorithm, which is central to the global search approach. In the second part of the unit, on dependency parsing, we study the arc-standard algorithm, which exemplifies the local search approach.
Section | Title | Video | Slides | Quiz |
---|---|---|---|---|
4.1 | Introduction to sequence labelling | video | slides | quiz |
4.2 | Approaches to sequence labelling | video | slides | quiz |
4.3 | The Viterbi algorithm | video | slides | quiz |
4.4 | Introduction to dependency parsing | video | slides | quiz |
4.5 | The arc-standard algorithm | video | slides | quiz |
4.6 | Neural architectures for dependency parsing | video | slides | quiz |
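To make the global search idea concrete, here is a minimal pure-Python sketch of Viterbi decoding for an HMM-style tagger. The probability tables and tag names in the example are illustrative toy values, not material from the lectures:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most probable state (tag) sequence for an observation sequence."""
    # best[t][s]: probability of the best path ending in state s at position t
    best = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        best.append({})
        back.append({})
        for s in states:
            # choose the best predecessor state for s (global max, not greedy)
            prev = max(states, key=lambda p: best[t - 1][p] * trans_p[p][s])
            best[t][s] = best[t - 1][prev] * trans_p[prev][s] * emit_p[s][obs[t]]
            back[t][s] = prev
    # backtrace from the best final state to recover the full tag sequence
    last = max(states, key=lambda s: best[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))
```

Because the maximisation at each position ranges over all predecessor states, the algorithm finds the globally best tag sequence in O(n·|S|²) time rather than enumerating all |S|ⁿ sequences.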
## Lab
In this lab, you will implement a simplified version of the dependency parser presented by Glavaš and Vulić (2021). This parser consists of a transformer encoder followed by a biaffine layer that computes arc scores for all pairs of words. As the encoder, you will use the uncased DistilBERT base model from the Transformers library.
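The following NumPy sketch illustrates the biaffine scoring idea only; it is not the Glavaš and Vulić implementation, and the function and variable names are hypothetical. It assumes encoder states `H` with one row per word and omits details such as the dependent-side bias and ROOT handling:

```python
import numpy as np

def biaffine_arc_scores(H, U, u):
    """Arc scores S[i, j] for dependent word i with candidate head word j.

    H: encoder states, shape (n, d)  -- assumed one row per word
    U: biaffine weight matrix, shape (d, d)
    u: head bias vector, shape (d,)
    Implements S[i, j] = H[i] @ U @ H[j] + u @ H[j].
    """
    bilinear = H @ U @ H.T        # (n, n): pairwise bilinear term
    head_bias = H @ u             # (n,): head-only score contribution
    return bilinear + head_bias[None, :]  # broadcast bias across dependents
```

Computing the scores as two matrix products gives all n² dependent–head pairs at once, which is why the biaffine layer is a natural fit on top of a transformer encoder's per-word states.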