The Theory group specialises in the logical, mathematical and statistical foundations of computer science, with a breadth and depth of expertise almost unmatched in the UK. It tackles the hard problems inherent in discovering the power and limitations of computer systems, and how principled design based on the right mathematical models might make them more robust and secure.
Its work has spearheaded several advances – separation logic-based verification, logic for continuous systems, information theory for security, game semantics for program analysis – in which our novel theoretical work has been brought to bear in new application areas. We have also made fundamental contributions in pure logic (model theory, proof theory, categorical semantics) and in complexity theory.
More details on our work follow below.
Separation Logic
Queen Mary is the birthplace of Separation Logic, one of the most exciting developments in current Theoretical Computer Science, addressing the 30-year problem of tractable reasoning about data structures held in computer memory. It has led to a powerful new approach to reasoning about shared-memory concurrent programs, methods for combating aliasing in object-oriented programs, and has helped reignite worldwide interest in automatic heap verification.
Work by Professor Dino Distefano on the tools Space Invader and Abductor has produced leading practical results in this area. Tool development is under way in collaboration with Microsoft, which is interested in the automated validation of its extensive library of device drivers (whose failures cause most Windows system crashes).
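The central idea of separation logic – the separating conjunction, which composes assertions only about disjoint pieces of the heap – can be sketched in a toy model. This is an illustrative encoding of our own (heaps as Python dictionaries from addresses to cell contents), not the notation or tooling used in the research above.

```python
# Toy model of separation logic's separating conjunction (illustrative
# sketch only; the encoding is hypothetical, not a real verifier).

def sep_con(h1, h2):
    """h1 * h2: defined only when the two heaps occupy disjoint addresses."""
    if set(h1) & set(h2):
        return None  # overlapping footprints: h1 * h2 is undefined
    merged = dict(h1)
    merged.update(h2)
    return merged

# Two list cells at distinct addresses compose into a larger heap...
cell_a = {0x10: ("data", 0x20)}
cell_b = {0x20: ("data", None)}
print(sep_con(cell_a, cell_b))  # a two-cell linked list

# ...but aliased cells do not: the overlap makes the conjunction
# undefined, which is exactly how separation logic rules out hidden
# aliasing when reasoning about heap-manipulating programs.
print(sep_con(cell_a, {0x10: ("data", None)}))  # None
```

The pay-off of this disjointness discipline is local reasoning: a proof about one heap fragment remains valid when composed with any disjoint rest of the heap.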
Classical Logic and Sub-structural Logics
Professor Edmund Robinson and Dr Paulo Oliva have been part of the reinvigoration of classical proof theory, working on two fronts. On one, fundamental advances have moved past decades-old stumbling blocks, leading for the first time to semantic models that reflect the dynamics of the classical sequent calculus. On the other, work in proof mining has continued to extract computational content from pivotal but non-constructive theorems of mathematics, and has also led to novel approaches to game theory and to links between classical game theory and formal logic.
Game theory and backward induction
Dr Paulo Oliva (in collaboration with the University of Birmingham) has discovered fascinating connections between higher-order constructions in proof theory (bar recursion) and the computation of optimal strategies in a general class of sequential games.
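Backward induction itself is easy to illustrate. The sketch below solves a tiny two-player sequential game by recursing from the leaves back to the root; the game tree and payoffs are invented for illustration, and the bar-recursion constructions in the research are far more general than this finite special case.

```python
# Backward induction on a finite sequential game (hypothetical toy
# example; payoffs are given from player 0's point of view).

def backward_induction(node, player):
    """Return (optimal outcome, optimal move list) from this node.
    A node is either an int payoff (a leaf) or a list of subtrees.
    Player 0 maximises the payoff, player 1 minimises it."""
    if isinstance(node, int):
        return node, []
    best = None
    for move, child in enumerate(node):
        value, line = backward_induction(child, 1 - player)
        if best is None or (player == 0 and value > best[0]) \
                        or (player == 1 and value < best[0]):
            best = (value, [move] + line)
    return best

# A depth-2 game: player 0 picks a branch, then player 1 replies.
game = [[3, 1], [4, 2]]            # payoffs at the leaves
print(backward_induction(game, 0)) # (2, [1, 1])
```

Player 1 would answer branch 0 with payoff 1 and branch 1 with payoff 2, so player 0 optimally chooses branch 1: the subgame-perfect value is 2.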
Verification of Concurrent and Large-Scale Software
Dr Michael Tautschnig’s work is the first to analyse, confirm and explain concurrency errors in PostgreSQL. Current work focuses on enabling analyses of entire software ecosystems, such as a complete Linux distribution.
Compositional Distributional Models of Meaning
The subject pursued by Dr Mehrnoosh Sadrzadeh concerns models of natural language that combine two complementary sources of information: statistical data and grammatical structure. Statistical models are pragmatic: they build vectors from real corpora to represent the contextual meanings of words. Grammatical models are abstract: they use ordered algebras to represent the general rules of phrase formation.
Whereas statistical models have difficulty representing grammatical structure, logical models have little to say about the meanings of individual words. Sadrzadeh’s research combines the two in a single model and tests its predictions experimentally; the results so far are very promising. Current goals include extending the theory to words whose meanings are not context-based, such as pronouns and quantifiers, and the experiments to mainstream tasks such as discourse semantics and entailment, as well as to other fields such as audio and visual corpora.
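One recurring idea in this line of work is that some words act as linear maps on the vectors of the words they modify. The sketch below is a minimal toy: the two context dimensions, the vectors and the matrix are all invented numbers, standing in for parameters a real model would learn from corpora.

```python
# Minimal sketch of "grammar acts on vectors": an adjective as a
# linear map applied to a noun vector (all numbers are hypothetical).

def matvec(m, v):
    """Apply matrix m to vector v."""
    return [sum(m[i][j] * v[j] for j in range(len(v))) for i in range(len(m))]

def cosine(u, v):
    """Cosine similarity, the usual comparison in vector models."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = lambda w: sum(x * x for x in w) ** 0.5
    return dot / (norm(u) * norm(v))

# Noun vectors over two invented context dimensions: (food, animal).
cat  = [0.1, 0.9]
fish = [0.7, 0.6]

# The adjective "fried" as a matrix that pushes meaning towards "food".
fried = [[1.0, 0.5],
         [0.0, 0.1]]

fried_fish = matvec(fried, fish)       # composed meaning of the phrase
print(cosine(fried_fish, [1.0, 0.0]))  # very close to the "food" axis
```

The point of the composition is that the phrase vector is determined by the grammar (adjective applied to noun), not just by adding the word vectors together.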
Information Theory and Security
Professor Pasquale Malacaria’s research addresses the question: how can we measure the security of a system, and in particular, how can we quantify leakage of confidential information? Computer systems are never 100% secure, so the ability to measure their security is crucial.
Recent achievements include quantifying information leaks in the Linux kernel and verifying the effectiveness of the corresponding security patches. His most recent work, of a more theoretical nature, establishes a fundamental connection between confidentiality and thermodynamics: a secure computation must generate a certain amount of heat to do its job properly, and the amount of heat generated can be derived from the information-theoretic measures developed in Malacaria’s research.
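The basic quantitative idea can be sketched concretely. For a deterministic program with a uniformly distributed secret, the Shannon leakage is the entropy of the attacker-visible output. The example below is a standard textbook-style illustration, not the (more refined) measures or kernel analyses of the research itself.

```python
import math
from collections import Counter

# Sketch of quantitative information flow (illustrative example only;
# research measures also include e.g. min-entropy leakage).

def shannon_leakage(secrets, observe):
    """Leakage in bits of a deterministic program = entropy of its
    observable output, assuming a uniform prior over the secrets."""
    counts = Counter(observe(s) for s in secrets)
    n = len(secrets)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A password check leaks little: it reveals only match / no-match.
secrets = range(8)                      # a 3-bit secret, H = 3 bits
check = lambda s: s == 5                # attacker-visible result
print(shannon_leakage(secrets, check))  # ~0.54 of the 3 bits leak

# Returning the secret mod 4 leaks far more: 2 full bits.
print(shannon_leakage(secrets, lambda s: s % 4))  # 2.0
```

Measuring leakage this way lets one compare programs and patches quantitatively, rather than with an all-or-nothing notion of secrecy.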
Formal Models and Human Error
Professor Paul Curzon has developed a way to model human behaviour at a level of abstraction suitable for formal reasoning. The models are used to reason about the possibility of people making systematic errors when using interactive systems. On his current project, CHI+MED, he is applying these techniques to the design of medical devices such as infusion pumps.
Game Semantics and Program Analysis
Dr Nikos Tzevelekos has developed a series of fully abstract models of program behaviour by interpreting computation as a formal game between the program and its environment. He is applying this method, called Game Semantics, to attack hard problems in program analysis, including model checking, control-flow analysis and information flow.
Network Coding and Complexity
Dr Søren Riis’ work on networks and information theory has identified new laws of information theory, which he has shown are needed to identify certain information bottlenecks in communication networks. The ideas and techniques involved (e.g. graph entropy and guessing games) have been used to settle a number of long-standing problems in complexity theory. This work has been presented in talks at the Isaac Newton Institute, Cambridge.
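Guessing games are simple to state: each player must guess the value of their own "hat" having seen only the hats of their neighbours in a graph, and all players must be simultaneously correct. The sketch below exhaustively checks the classic coordinated strategy on the complete graph, where every player assumes the hats sum to zero modulo the alphabet size; it is a standard illustrative example, not the network-coding applications of the research.

```python
from itertools import product

# Toy guessing game on the complete graph K_n (standard example; in
# the research such games bound network coding capacity).

def all_correct_fraction(n, q):
    """Each of n players sees all other hats (values in 0..q-1) and
    guesses their own assuming the total of all hats is 0 mod q.
    Returns the fraction of assignments where EVERY guess is right."""
    wins = 0
    for hats in product(range(q), repeat=n):
        guesses = [(-(sum(hats) - hats[i])) % q for i in range(n)]
        wins += all(g == h for g, h in zip(guesses, hats))
    return wins / q ** n

print(all_correct_fraction(3, 3))  # 1/3, versus 1/27 for blind guessing
```

All players are right exactly when the total really is 0 mod q, so the success probability is 1/q rather than the 1/q^n achieved by uncoordinated guessing; how far such coordination can go on a given digraph is what the graph's guessing number measures.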
Professor Ursula Martin has done extensive work on combinatorics, especially in relation to computer algebra.
Continuous Dynamical Systems and Computational Logic
Professor Ursula Martin, Dr Rob Arthan and Dr Paulo Oliva have developed a novel approach to reasoning about continuous dynamical systems that bridges mathematical systems theory and Hoare logic. This raises deep theoretical questions requiring a unification of the models used in computer science and in dynamical systems, as well as practical verification problems spurred by contacts with QinetiQ and Intel.

Professor Ursula Martin and Dr Alison Pease are investigating current mathematical practice, in particular the ways in which mathematicians use web 2.0 technology to pose, explore and prove conjectures. They are studying these new practices with the aim of developing a research road map that considers the patterns of reasoning which emerge through social interaction and collaborative mathematics.
Nominal Automata
Dr Nikos Tzevelekos is working on machine models that manipulate infinite alphabets of identifiers, in order to describe computation involving the manipulation and fresh generation of resources. These so-called nominal automata find applications in the analysis and verification of languages with generative resources, such as Java and ML.
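The flavour of such machines can be conveyed with a toy example. A finite automaton cannot check any non-trivial equality between letters drawn from an infinite alphabet of names, but a single register suffices for simple properties; the encoding below is an illustrative sketch of our own, and real nominal automata additionally model fresh-name generation.

```python
# A one-register automaton over an infinite alphabet of names
# (hypothetical illustrative encoding, not a tool from the research).

def accepts(word):
    """Accept iff the first name in the word occurs again later -
    checkable with one register, but by no finite-state machine
    over an infinite alphabet of names."""
    register = None
    state = "empty"
    for name in word:
        if state == "empty":
            register = name                  # store the first name seen
            state = "seeking"
        elif state == "seeking" and name == register:
            state = "accept"                 # the stored name reappeared
    return state == "accept"

print(accepts(["x", "y", "x"]))  # True: "x" reappears
print(accepts(["x", "y", "z"]))  # False
```

The register is what makes the alphabet's infiniteness tractable: the machine's control is still finite, and only the stored name varies over an unbounded domain.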