
Department of Computer Science and Technology

Course pages 2023–24

Introduction to Computational Semantics

Principal lecturers: Dr Weiwei Sun, Prof Simone Teufel
Taken by: MPhil ACS, Part III
Code: L98
Term: Michaelmas
Hours: 16 (8 × 2-hour lectures)
Format: In-person lectures, student presentations, and group discussions
Class limit: max. 16 students

Aims

This is a lecture-style course that introduces students to various aspects of the semantics of natural language (mainly English):

  • Lexical Semantics, with an emphasis on theory and phenomenology (4 sessions)
  • Compositional Semantics (9 sessions)
  • Discourse and pragmatics-related aspects of semantics (3 sessions)

Learning outcomes

On completion of the module, students should be able to:

  • Give an operational definition of what is meant by “meaning” (above and beyond syntax, for instance);
  • Name the types of phenomena in language that require semantic consideration, in terms of lexical, compositional and discourse/pragmatic aspects; in other words, argue why semantics is important;
  • Demonstrate an understanding of the basics of various semantic representations, including logic-based and graph-based ones: their properties, how they are used, why they are important, and how they differ from syntactic representations (a toy illustration follows this list);
  • Know how such semantic representations are derived during or after parsing, and how they can be analysed and mapped to surface strings;
  • Understand applications of semantic representations, e.g. reasoning and validation, and the methods by which these are approached;
  • When designing NL tasks that clearly require semantic processing (e.g. knowledge-based QA), be aware of and reuse existing semantic representations and algorithms rather than reinventing the wheel.
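
As a purely illustrative aside (not part of the course material), the sketch below contrasts the two families of representations mentioned above: a logic-based, compositional treatment of “every dog barks”, evaluated against an invented toy world, and a graph-based encoding of the same sentence, loosely in the spirit of AMR. All names in the sketch are made up for the example.

```python
# Hypothetical toy example: two semantic representations of "every dog barks".

# 1. Logic-based and compositional: Python lambdas stand in for lambda-calculus
#    terms, and predicates are tests over a tiny invented model world, so the
#    resulting formula can actually be evaluated for truth.
world_entities = {"fido", "rex", "felix"}
dogs = {"fido", "rex"}
barkers = {"fido", "rex"}

dog = lambda x: x in dogs                    # noun:       λx. dog(x)
barks = lambda x: x in barkers               # verb:       λx. bark(x)
every = lambda p: lambda q: all(q(x) for x in world_entities if p(x))
                                             # determiner: λP.λQ.∀x (P(x) → Q(x))

# Function application mirrors the syntactic derivation:
# [[every dog barks]] = every(dog)(barks)
print(every(dog)(barks))                     # True in this toy world

# 2. Graph-based: the same sentence as predicate-argument triples, loosely in
#    the spirit of AMR/DMRS-style graphs.
semantic_graph = [
    ("b", "instance", "bark-01"),
    ("d", "instance", "dog"),
    ("b", "ARG0", "d"),
    ("d", "quant", "every"),
]
```

Note how, in the logic-based version, the meaning of the whole sentence is obtained purely by applying the meanings of its parts to one another; this is the informal picture of compositionality that the course develops properly.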

Practical advantages of this course for NLP students

  • Knowledge of underlying semantic effects helps improve NLP evaluation, for instance by providing more meaningful error analysis. You will be able to link particular errors to design decisions inside your system.
  • You will learn methods for better benchmarking of your system, whatever the task may be. Supervised ML systems (in particular black-box systems such as deep learning models) are only as clever as the datasets they are trained on. In this course, you will learn to design datasets that are harder to trick without real understanding, and to critique existing datasets.
  • You will be able to design tests for ML systems that better pinpoint which aspects of language an end-to-end system has “understood” (a toy sketch of such a test follows this list).
  • You will learn to detect ambiguity and ill-formed semantics in human-human communication. This can help you write more clearly and logically.
  • You will learn to decompose complex semantics-reliant tasks sensibly, so that you can reuse the techniques underlying semantic analyzers in a modular way. Rather than being forced to treat such tasks in an end-to-end manner, you will then profit from partial explanations and a better error analysis already built into the system.
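
As a hedged illustration of the kind of phenomenon-targeted test mentioned above (the sentences, labels and the naive keyword baseline are all invented for this sketch), one might probe a sentiment classifier with minimal pairs that differ only in negation:

```python
# Hypothetical sketch of a phenomenon-specific probe: minimal pairs that differ
# only in negation, so errors point at one aspect of language rather than at
# the task as a whole.

negation_minimal_pairs = [
    # (sentence, expected_label)
    ("The film was good.", "positive"),
    ("The film was not good.", "negative"),
    ("I would recommend this hotel.", "positive"),
    ("I would not recommend this hotel.", "negative"),
    ("It was not bad at all.", "positive"),
    ("It was bad.", "negative"),
]

def probe(model, pairs):
    """Accuracy on a phenomenon-specific test set."""
    correct = sum(model(sentence) == gold for sentence, gold in pairs)
    return correct / len(pairs)

# A deliberately naive keyword baseline ("negative" if it sees "not" or "bad")
# stands in for whatever system is actually being evaluated.
naive_model = lambda s: "negative" if ("not" in s or "bad" in s) else "positive"
print(f"negation probe accuracy: {probe(naive_model, negation_minimal_pairs):.2f}")
# The baseline fails exactly on "It was not bad at all.", i.e. on the item that
# requires composing negation with the predicate rather than keyword spotting.
```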

Syllabus

  • Week 1
    • Two-hour lecture: General introduction + Events
    • Assignment 1: reading papers and corpora about crowdsourcing annotations
  • Week 2
    • One-hour lecture: Referentiality
    • One-hour student presentation + discussion on homework 1
    • Reading assignment: coreference
  • Week 3
    • One-hour lecture: Truth-conditional semantics
    • One-hour student presentation + discussion on coreference
  • Week 4
    • Two-hour lecture: Graph-based meaning representations (MR) + Semantic parsing
    • Assignment 2: evaluating cross-lingual semantic parsers
  • Week 5
    • Two-hour lecture: Compositionality + Weakly compositional phenomena
    • Reading assignment: code generation
  • Week 6
    • One-hour lecture: Negation (including presupposition and pragmatics)
    • One-hour student presentation + discussion on code generation
  • Week 7
    • One-hour lecture: English Resource Semantics
    • One-hour student presentation + discussion on homework 2
    • Assignment 3: task-specific semantic parsing as cross-lingual parsing
  • Week 8
    • 1.5-hour lecture: Lexical semantics + Grounding
    • 0.5-hour summary of the course

Assessment

  • Assignment 1 (ticked exercise): 5%
  • Assignment 2: 30%
  • Assignment 3: 45%
  • Presentations: practice presentation 5%, final presentation 15%