Computer Laboratory

Course pages 2014–15

Natural Language Processing

Principal lecturer: Prof Ann Copestake
Taken by: Part II

No. of lectures: 12
Suggested hours of supervisions: 3
Prerequisite courses: Mathematical Methods for Computer Science, Logic and Proof, and Artificial Intelligence I


Aims

This course introduces the fundamental techniques of natural language processing. It aims to explain the potential and the main limitations of these techniques. Some current research issues are introduced, and some current and potential applications are discussed and evaluated.


Lectures

The order of delivery of the lectures is provisional.

  • Introduction. Brief history of NLP research, current applications, components of NLP systems.

  • Finite-state techniques. Inflectional and derivational morphology, finite-state automata in NLP, finite-state transducers.

  • Prediction and part-of-speech tagging. Corpora, simple N-grams, word prediction, stochastic tagging, evaluating system performance.

  • Context-free grammars and parsing. Generative grammar, context-free grammars, parsing with context-free grammars, weights and probabilities. Limitations of context-free grammars.

  • Constraint-based grammars. Constraint-based grammar, unification.

  • Compositional semantics. Simple compositional semantics in constraint-based grammar. Compositional semantics with lambda calculus. Inference and robust entailment.

  • Lexical semantics. Semantic relations, WordNet, word senses, word sense disambiguation.

  • Distributional semantics. Representing lexical meaning with distributions. Similarity metrics. Clustering.

  • Discourse and dialogue. Anaphora resolution, discourse relations.

  • Language generation. Generation and regeneration. Components of a generation system. Generation of referring expressions.

  • Computational psycholinguistics. Modelling human language use.

  • Applications. Examples of practical applications of NLP techniques.
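As a taste of the N-gram material listed above, the following sketch builds a maximum-likelihood bigram model from a toy corpus and uses it for word prediction. The corpus and function names are invented for illustration; a real system would use a large corpus and smoothing.

```python
from collections import Counter

# Toy corpus, whitespace-tokenised (invented for illustration).
corpus = "the cat sat on the mat . the cat ate the fish .".split()

# Count unigrams and adjacent word pairs (bigrams).
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def predict_next(word):
    """Most likely next word under a maximum-likelihood bigram model:
    P(w2 | w1) = count(w1, w2) / count(w1)."""
    candidates = {w2: c / unigrams[word]
                  for (w1, w2), c in bigrams.items() if w1 == word}
    return max(candidates, key=candidates.get) if candidates else None

print(predict_next("the"))  # "cat" follows "the" more often than "mat" or "fish"
```

The same counts, attached to (word, tag) pairs rather than words, underlie simple stochastic taggers.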
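Parsing with context-free grammars, also listed above, can be illustrated with a minimal CYK-style recogniser. The grammar and lexicon here are invented toy examples restricted to Chomsky normal form (each rule rewrites to exactly two categories or one word):

```python
# Toy grammar in Chomsky normal form: (left child, right child) -> parent.
grammar = {
    ("NP", "VP"): "S",
    ("Det", "N"): "NP",
    ("V", "NP"): "VP",
}
lexicon = {"the": "Det", "dog": "N", "cat": "N", "chased": "V"}

def recognise(words):
    """Return True if the grammar derives the word sequence, via CYK."""
    n = len(words)
    # chart[i][j] holds the categories spanning words[i..j] inclusive.
    chart = [[set() for _ in range(n)] for _ in range(n)]
    for i, w in enumerate(words):
        chart[i][i].add(lexicon[w])
    for span in range(2, n + 1):          # widths, shortest first
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):         # every split point
                for left in chart[i][k]:
                    for right in chart[k + 1][j]:
                        parent = grammar.get((left, right))
                        if parent:
                            chart[i][j].add(parent)
    return "S" in chart[0][n - 1]

print(recognise("the dog chased the cat".split()))  # True
```

Replacing the sets of categories with weighted entries turns the same chart into a probabilistic parser, which is the step the lecture on weights and probabilities covers.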
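Finally, the distributional-semantics lecture above rests on the idea that words occurring in similar contexts have similar meanings. A minimal sketch, with an invented toy corpus: build co-occurrence count vectors from a fixed context window and compare them with the cosine similarity metric.

```python
from collections import Counter
from math import sqrt

def vector(word, corpus, window=2):
    """Co-occurrence count vector for `word` over a ±window context."""
    v = Counter()
    for i, w in enumerate(corpus):
        if w == word:
            for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
                if j != i:
                    v[corpus[j]] += 1
    return v

def cosine(u, v):
    """Cosine of the angle between two sparse count vectors."""
    dot = sum(u[k] * v[k] for k in u)   # Counter returns 0 for absent keys
    norm = lambda x: sqrt(sum(c * c for c in x.values()))
    return dot / (norm(u) * norm(v)) if u and v else 0.0

corpus = "the cat drinks milk . the dog drinks milk . the cat hunts mice .".split()
cat, dog, milk = (vector(w, corpus) for w in ("cat", "dog", "milk"))
print(cosine(cat, dog) > cosine(cat, milk))  # True: cat and dog share contexts
```

Clustering the resulting vectors groups distributionally similar words, which is the link to the clustering topic in the same lecture.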


Objectives

At the end of the course students should

  • be able to discuss the current and likely future performance of several NLP applications;

  • be able to describe briefly a fundamental technique for processing language for several subtasks, such as morphological processing, parsing and word sense disambiguation;

  • understand how these techniques draw on and relate to other areas of computer science.

Recommended reading

* Jurafsky, D. & Martin, J. (2008). Speech and language processing. Prentice Hall.

For background reading, one of:
Pinker, S. (1994). The language instinct. Penguin.
Matthews, P. (2003). Linguistics: a very short introduction. OUP.

Although the NLP lectures do not assume any prior exposure to linguistics, the course will be easier to follow if students have some understanding of basic linguistic concepts.

For reference purposes:
The Internet Grammar of English.