Department of Computer Science and Technology

Course pages 2017–18

Natural Language Processing

Principal lecturer: Prof Ann Copestake
Taken by: Part II

No. of lectures: 12
Suggested hours of supervisions: 3
Prerequisite courses: Logic and Proof, Artificial Intelligence


This course introduces the fundamental techniques of natural language processing. It aims to explain the potential and the main limitations of these techniques. Some current research issues are introduced, and some current and potential applications are discussed and evaluated.


The order of delivery of the lectures is provisional.

  • Introduction. Brief history of NLP research, some current applications, components of NLP systems.

  • Finite-state techniques. Inflectional and derivational morphology, finite-state automata in NLP, finite-state transducers.

  • Prediction and part-of-speech tagging. Corpora, simple N-grams, word prediction, stochastic tagging, evaluating system performance.

  • Context-free grammars and parsing. Generative grammar, context-free grammars, parsing with context-free grammars, weights and probabilities. Some limitations of context-free grammars.

  • Dependencies. Dependency structures. English as an outlier. Universal dependencies. Introduction to dependency parsing.

  • Compositional semantics. Logical representations. Compositional semantics and lambda calculus. Inference and robust entailment. Negation.

  • Lexical semantics. Semantic relations, WordNet, word senses.

  • Distributional semantics. Representing lexical meaning with distributions. Similarity metrics.

  • Distributional semantics and deep learning. Embeddings. Grounding. Multimodal systems and visual question answering.

  • Discourse processing. Anaphora resolution, summarisation.

  • Language generation and regeneration. Components of a generation system. Summarisation. Generation of referring expressions.

  • Recent NLP research. An introduction to a selection of current NLP research topics.
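As a small illustration of the word-prediction topic above, the sketch below builds a bigram model from counts. The toy corpus, the function names and the maximum-likelihood estimate without smoothing are all illustrative assumptions, not course material:

```python
from collections import Counter, defaultdict

# Toy corpus (illustrative only), pre-tokenised into words.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count bigrams: how often each word follows each preceding word.
bigrams = defaultdict(Counter)
for prev, word in zip(corpus, corpus[1:]):
    bigrams[prev][word] += 1

def predict(prev):
    """Return the most frequent next word after `prev` under the bigram counts."""
    return bigrams[prev].most_common(1)[0][0]

def bigram_prob(prev, word):
    """Unsmoothed maximum-likelihood estimate of P(word | prev)."""
    total = sum(bigrams[prev].values())
    return bigrams[prev][word] / total
```

A real system would add smoothing (e.g. add-one or backoff) so that unseen bigrams do not receive zero probability.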
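The context-free parsing topic can likewise be sketched with CYK recognition over a toy grammar in Chomsky normal form. The grammar, lexicon and function names here are illustrative assumptions:

```python
# Toy grammar in Chomsky normal form (illustrative only): a pair of child
# categories maps to the set of possible parent categories.
binary_rules = {
    ("NP", "VP"): {"S"},
    ("Det", "N"): {"NP"},
    ("V", "NP"): {"VP"},
}
lexical_rules = {
    "the": {"Det"},
    "dog": {"N"},
    "cat": {"N"},
    "chased": {"V"},
}

def cyk_recognise(words):
    """CYK recognition: True iff `words` derives S under the toy grammar."""
    n = len(words)
    # chart[i][j] holds the categories spanning words[i:j].
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(lexical_rules.get(w, set()))
    for span in range(2, n + 1):          # widen spans bottom-up
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):     # try every split point
                for left in chart[i][k]:
                    for right in chart[k][j]:
                        chart[i][j] |= binary_rules.get((left, right), set())
    return "S" in chart[0][n]
```

Extending each chart cell to carry rule weights instead of a bare category set gives the probabilistic (weighted) variant mentioned in the lecture topic.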
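Finally, the distributional-semantics topic rests on comparing words by their context distributions; a minimal sketch using cosine similarity over sparse co-occurrence counts follows. The toy counts are invented for illustration:

```python
import math

# Toy co-occurrence counts (illustrative only): each word is represented by
# counts of context words observed near it in some corpus.
vectors = {
    "dog": {"bark": 4, "pet": 3, "run": 2},
    "cat": {"pet": 3, "run": 1, "purr": 5},
    "car": {"drive": 6, "road": 4},
}

def cosine(u, v):
    """Cosine similarity between two sparse count vectors stored as dicts."""
    dot = sum(u[k] * v[k] for k in u.keys() & v.keys())
    norm_u = math.sqrt(sum(x * x for x in u.values()))
    norm_v = math.sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v)
```

Under this metric "dog" and "cat" come out more similar than "dog" and "car", since the former pair share context words; dense embeddings (the deep-learning topic above) replace these raw counts with learned low-dimensional vectors.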


At the end of the course students should

  • be able to discuss the current and likely future performance of several NLP applications;

  • be able to describe briefly a fundamental technique for processing language for several subtasks, such as morphological processing, parsing and summarisation;

  • understand how these techniques draw on and relate to other areas of computer science.

Recommended reading

* Jurafsky, D. & Martin, J. (2008). Speech and language processing (2nd ed.). Prentice Hall.

Although the NLP lectures don’t assume any exposure to linguistics, the course will be easier to follow if students have some understanding of basic linguistic concepts. The following may be useful for reference purposes:

The Internet Grammar of English,