Hierarchical Methods in Computer Graphics

Professor Hans-Peter Seidel - University of Erlangen

A central problem in computer graphics is the enormous size of the data sets that need to be processed. With the development of ever more powerful modeling and simulation tools, the increasing availability of high-resolution 3D scanners, and advances in medical imaging, this problem will become even more severe in the future.

In order to deal with these huge amounts of data, hierarchical methods, multiresolution representations, and wavelets are currently evolving into a core technique in computer graphics. Their power lies in the fact that they require only a small number of coefficients to represent complex functions and large data sets accurately. This leads to new compression algorithms and to efficient computations that exploit smoothness and coherence. Examples of their use can be found in many areas of computer graphics.
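
As a concrete illustration of why a few coefficients can suffice (this sketch is not taken from the talk; the signal, the transform, and the threshold are invented for the example), the following Python snippet applies a one-dimensional Haar wavelet transform to a smooth signal, keeps only the 16 largest of 256 coefficients, and still reconstructs a close approximation:

```python
# Minimal sketch of wavelet compression: a 1D Haar transform concentrates a
# smooth signal into a few large coefficients, so discarding the small ones
# yields a compact but accurate approximation.
import numpy as np

def haar_transform(signal):
    """Full orthonormal Haar decomposition of a signal whose length is a power of two."""
    coeffs = signal.astype(float).copy()
    n = len(coeffs)
    while n > 1:
        half = n // 2
        avg = (coeffs[0:n:2] + coeffs[1:n:2]) / np.sqrt(2.0)
        diff = (coeffs[0:n:2] - coeffs[1:n:2]) / np.sqrt(2.0)
        coeffs[:half] = avg          # coarser approximation
        coeffs[half:n] = diff        # detail coefficients at this level
        n = half
    return coeffs

def haar_inverse(coeffs):
    """Invert the transform above."""
    out = coeffs.copy()
    n = 2
    while n <= len(out):
        half = n // 2
        avg, diff = out[:half].copy(), out[half:n].copy()
        out[0:n:2] = (avg + diff) / np.sqrt(2.0)
        out[1:n:2] = (avg - diff) / np.sqrt(2.0)
        n *= 2
    return out

# Smooth test signal: most of its Haar coefficients are close to zero.
x = np.linspace(0.0, 1.0, 256)
signal = np.sin(2 * np.pi * x) + 0.5 * x
coeffs = haar_transform(signal)

# Keep only the 16 largest coefficients and reconstruct from them.
kept = coeffs.copy()
kept[np.argsort(np.abs(coeffs))[:-16]] = 0.0
approx = haar_inverse(kept)
print("max reconstruction error:", np.max(np.abs(approx - signal)))
```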

Several examples from ongoing projects illustrate the approach and demonstrate the strength of the underlying concepts.

Security Protocols and Their Correctness

Dr Larry Paulson - University of Cambridge Computer Laboratory

Security protocols are used in the Internet, mobile phones, digital payment systems, and so on. Their goals may be to keep data secret, to protect it from tampering, or to prevent intruders from assuming somebody else's identity. A faulty protocol can be attacked by simple means, such as replaying parts of old sessions, without any brute-force codebreaking.
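
To make the replay idea concrete, here is a toy sketch in Python (not taken from the talk; the flawed one-message protocol, the symbolic encryption, and all names such as Kab, alice_send and bob_accept are invented for this illustration). The responder accepts anything encrypted under the shared key, so a recorded session can simply be played back:

```python
# Toy symbolic model of a replay attack on a flawed scheme  A -> B : {A, Na}_Kab.
# Encryption is modelled as an opaque tagged tuple, not a real cipher.

def enc(key, payload):
    return ("enc", key, payload)

def dec(key, message):
    tag, k, payload = message
    if tag != "enc" or k != key:
        raise ValueError("cannot decrypt")
    return payload

KAB = "Kab"                               # long-term key shared by A and B

def alice_send(nonce):
    """A's single, unsolicited authentication message."""
    return enc(KAB, ("Alice", nonce))

def bob_accept(message):
    """Flawed check: accept any message under Kab that names Alice."""
    sender, _nonce = dec(KAB, message)
    return sender == "Alice"

# Session 1: an eavesdropper passively records A's message on the network.
recorded = alice_send(nonce="Na-1")
assert bob_accept(recorded)

# Later: the recorded ciphertext is replayed without any knowledge of Kab.
assert bob_accept(recorded)               # accepted again -> authentication fails
print("replayed message accepted without knowing Kab")

# A sound protocol makes B issue a fresh challenge that A must incorporate,
# so a message from an old session can never satisfy a new run.
```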

Researchers have developed tools to search for such attacks. However, failure to find attacks does not mean that a protocol is correct. Protocols and their goals are seldom specified formally, which makes it hard to say whether they are correct, even when possible attacks are pointed out.

The speaker will outline recent approaches to showing correctness, taking as an example a simple public-key protocol.

Automatic parallel programming on the Delphi Machine

Ian Lewis - University of Cambridge Computer Laboratory

This lecture could equally have been titled "How to search a large maze with 42 people". The execution of logic programs in languages such as pure Prolog can be interpreted as the systematic search of an equivalent irregular acyclic graph. The lecture describes a method of automatically dividing this search among a number of processors with a minimum of communication. The technique has an inherent resilience against processor failure and is suitable for implementation on general-purpose workstations and networks. The PrologPF implementation at Cambridge is described and the results discussed.
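
As a rough sketch of how a search might be divided with little communication (an assumption made purely for illustration, not a description of the actual Delphi or PrologPF machinery; the branching function and goal test are invented), each worker below re-runs the same deterministic depth-first search from the root but only explores the subtree selected by its assigned prefix of branch choices. No communication is needed while searching, and a failed worker's prefix can simply be handed to another processor:

```python
# Partitioning a depth-first search by handing each worker a prefix of branch
# choices; every length-2 prefix selects a disjoint subtree of the search space.
from itertools import product

def children(node):
    """Hypothetical branching function of an irregular search graph."""
    depth, label = node
    if depth == 4:
        return []                                    # leaf
    fanout = 2 + (sum(map(ord, label)) % 3)          # irregular, but deterministic
    return [(depth + 1, f"{label}.{i}") for i in range(fanout)]

def is_solution(node):
    depth, label = node
    return depth == 4 and label.endswith(".0.1")     # toy goal test

def search(node, prefix):
    """DFS restricted to the subtree reached by following `prefix` from `node`."""
    if prefix:                                       # still steering towards our region
        kids = children(node)
        if prefix[0] >= len(kids):
            return []                                # this branch does not exist here
        return search(kids[prefix[0]], prefix[1:])
    results = [node] if is_solution(node) else []
    for kid in children(node):
        results.extend(search(kid, ()))
    return results

root = (0, "root")
# Each iteration below would run on a separate processor in a parallel setting.
solutions = []
for prefix in product(range(4), repeat=2):
    solutions.extend(search(root, prefix))
print(len(solutions), "solutions found across all partitions")
```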

Hardware Security - Smartcards and other Tamper-Resistant Modules

Markus Kuhn - University of Cambridge Computer Laboratory

Many computer security applications depend on the secure storage of secret key material. In applications such as digital purses or pay-TV encryption systems, the processors storing these keys cannot be protected by walls and guards; often the key memory has to be placed directly in the hands of a potential attacker. Smartcards and other tamper-resistant processors are frequently cited as a solution to this problem, but there is little published material on how difficult it actually is for attackers to circumvent the physical protection of these low-cost devices. The talk will discuss various techniques that have been applied to break the security processors used in pay-TV encryption systems and digital purses with much less effort than the manufacturers had hoped.

Activity-Centred Retrieval and Visualisation of Heterogeneous Data

Matthew Chalmers - Ubilab, Zurich

Popular sources of information such as the World Wide Web are predominantly made up of heterogeneous mixtures of information types: text, graphics, sound and so forth. A number of approaches to automatically accessing such heterogeneous and complex information have been explored recently; this presentation discusses an approach based on collaborative filtering. The 'path model' extends collaborative filtering by taking into account the context of information access, in terms of the temporal order in which each user accesses items. Retrieval and relevance are driven by consistencies in usage histories of, for example, a web browser.

The first section of this talk describes the path model, along with related tools for URL recommendation and for visualisation of sets of URLs. This is followed by a broader discussion of the path model, in particular an epistemologically based comparison of this approach with other approaches to information access, putting forward ideas on how they might be combined and how they depend on one another.
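
As a strongly simplified sketch of the flavour of path-based recommendation (an assumption made for illustration, not the path model as presented in the talk; the histories and URLs are invented), the following snippet scores URLs that other users visited shortly after URLs appearing in the current user's recent path:

```python
# Recommend URLs that frequently follow, in other users' access histories,
# the URLs already visited by the current user.
from collections import Counter

def recommend(current_path, other_histories, window=2, top_n=3):
    """Score URLs that other users accessed within `window` steps after
    visiting any URL that also appears in the current user's path."""
    scores = Counter()
    seen = set(current_path)
    for history in other_histories:
        for i, url in enumerate(history):
            if url in seen:
                for follower in history[i + 1 : i + 1 + window]:
                    if follower not in seen:
                        scores[follower] += 1
    return [url for url, _count in scores.most_common(top_n)]

histories = [
    ["wiki/wavelets", "cg/multires", "cg/compression", "papers/haar"],
    ["news/today", "cg/multires", "papers/haar", "cg/rendering"],
    ["wiki/wavelets", "papers/lifting", "cg/compression"],
]
print(recommend(["wiki/wavelets", "cg/multires"], histories))
```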

Network computing: programming the WWW with coordination technology

Paolo Ciancarini - University of Bologna

Using Java, the WWW can be seen as a programming platform on which to build Internet-based services and multiuser applications. In fact, applications that require proactive or reactive processing and code mobility, such as groupware or workflow, are especially well suited to being at least monitored through the Web, and in several cases they can be fully controlled through it. We show how we design this kind of application, exploiting novel coordination models suitable for building software architectures that include different flavors of code mobility.
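
To give a feel for what a coordination model can look like in practice, here is a minimal Linda-style tuple space sketched in Python (the abstract does not commit to a particular coordination model or implementation language; the TupleSpace class and all identifiers below are invented for this illustration). Agents interact only by depositing and withdrawing tuples, which decouples them in time and space and fits naturally with mobile code:

```python
# Minimal Linda-style coordination: out() deposits a tuple, in_() withdraws a
# matching one, blocking until it appears.  None in a template is a wildcard.
import threading

class TupleSpace:
    def __init__(self):
        self._tuples = []
        self._cond = threading.Condition()

    def out(self, tup):
        with self._cond:
            self._tuples.append(tup)
            self._cond.notify_all()

    def in_(self, template):
        def matches(tup):
            return len(tup) == len(template) and all(
                t is None or t == v for t, v in zip(template, tup))
        with self._cond:
            while True:
                for tup in self._tuples:
                    if matches(tup):
                        self._tuples.remove(tup)
                        return tup
                self._cond.wait()

space = TupleSpace()

def worker(name):
    """A (possibly mobile) agent: withdraws a request tuple, deposits a result."""
    _, url = space.in_(("request", None))
    space.out(("result", url, f"processed by {name}"))

threading.Thread(target=worker, args=("agent-1",), daemon=True).start()
space.out(("request", "http://example.org/task"))
print(space.in_(("result", None, None)))
```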