Seminar, Sept. 26, 2019

The meeting will take place in Amphi B, 3rd floor, site Monod, ENS de Lyon.


10:30 – 12:00
Pierre Clairambault (CNRS, ENS de Lyon)

Game semantics usually presents execution as a total chronological ordering of observable computational events. It has proved to be a powerful tool in semantics, yielding a great many full abstraction results. Its first achievements were striking, capturing combinations of state and control within a single framework. This led to Abramsky's famous "semantic cube" programme: to capture a wide variety of effects on a single canvas. However, beyond the original "semantic square" arising from the combinations of control and state, developments in game semantics have been rather disconnected.

The family of game semantics usually known as "concurrent games", originally introduced by Abramsky and Melliès in 1999, has been actively developed in the past decade or so. In contrast with the total chronological ordering of traditional games, concurrent games present (implicitly or explicitly) the causal constraints between computational events. To that end they borrow tools from concurrency theory, and in particular rely on structures coming from "true concurrency". They are therefore well suited to the denotational semantics of concurrent programming languages.
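To give an informal picture of this contrast (a hypothetical sketch, not one of the actual formalisms used in concurrent games), one can represent the causal constraints between a few events as a partial order and list the total chronological orderings compatible with it:

```python
# A minimal, hypothetical sketch: events with causal dependencies.
# Traditional plays totally order all events; a causal presentation
# records only the constraints, leaving independent events unordered.

from itertools import permutations

# Illustrative events of a tiny parallel computation:
# 'q' = initial question, 'a1'/'a2' = two independent sub-answers,
# 'a' = final answer needing both.
causes = {
    'q':  set(),            # q depends on nothing
    'a1': {'q'},            # a1 causally depends on q
    'a2': {'q'},            # a2 causally depends on q (independent of a1)
    'a':  {'a1', 'a2'},     # a needs both sub-answers
}

def linearisations(causes):
    """All total orders (traditional plays) compatible with the causal order."""
    events = list(causes)
    return [p for p in permutations(events)
            if all(p.index(c) < p.index(e) for e in p for c in causes[e])]

print(linearisations(causes))
# One partial order stands for both interleavings of a1 and a2 at once.
```

A traditional, chronological presentation would have to pick one of the two total orders (or record both separately); the single causal structure represents them together.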

In this talk, I will give an introduction to concurrent games and present some developments that highlight a different property: they are good at linking distinct models with each other, both within and outside game semantics. Concretely, I will present first steps towards fixing the cube {state, parallelism, probabilities} -- all of its faces are currently broken. First, I will fix the face {state, parallelism}. For that I will introduce a notion of parallel innocence, exploiting the causal aspect of concurrent games. Time permitting, I will also fix {parallelism, probabilities}, via probabilistic concurrent games and their link with relational semantics. I will close with some perspectives on the full cube.

This talk builds on joint work with several people, including Simon Castellan, Hugo Paquet and Glynn Winskel.

14:00 – 15:00
Andrea Condoluci (University of Bologna (Italy))

In this talk I will present joint work with Beniamino Accattoli and Claudio Sacerdoti Coen, accepted at PPDP '19. The topic is how to check efficiently whether given λ-terms are equal, or more precisely α-equal: as simple as it may sound, the problem is non-trivial when λ-terms are represented succinctly with sharing, i.e. the reuse of subterms. This problem is relevant in implementations of the λ-calculus, for instance in proof assistants: type-checking algorithms for dependent type theories rely on a conversion subprocedure to check whether two λ-terms are convertible, i.e. whether they reduce to the same term. A lot of research has been carried out on how to evaluate λ-terms efficiently, which boils down to using sharing in one way or another; the missing part is how to compare the shared results for equality.

In the presence of sharing, one is actually interested in sharing equality, i.e. the equality of the underlying unshared λ-terms. The naive algorithm for sharing equality is exponential in the size of the shared terms, but in this talk I will present our linear-time algorithm (the first with such low complexity). Like others before us, we were inspired by Paterson and Wegman's algorithm for first-order unification, itself based on representing terms with sharing as DAGs, and sharing equality as bisimulation of DAGs.
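To illustrate why the naive "unfold then compare" approach is exponential (a hedged sketch: the names `App` and `Var` and the functions below are illustrative, not the paper's algorithm, and λ-abstractions are omitted for brevity):

```python
# Illustrative sketch: terms as DAGs whose nodes may be shared.
# Unfolding the sharing can blow up exponentially, which is why any
# equality check that unfolds first is exponential in the shared size.

class App:
    def __init__(self, fun, arg): self.fun, self.arg = fun, arg

class Var:
    def __init__(self, name): self.name = name

def unfolded_size(t):
    """Size of the underlying unshared term (ignores sharing)."""
    if isinstance(t, Var):
        return 1
    return 1 + unfolded_size(t.fun) + unfolded_size(t.arg)

def shared_size(t, seen=None):
    """Number of distinct DAG nodes (the size with sharing)."""
    if seen is None:
        seen = set()
    if id(t) in seen:
        return 0
    seen.add(id(t))
    if isinstance(t, Var):
        return 1
    return 1 + shared_size(t.fun, seen) + shared_size(t.arg, seen)

# Build a chain t_{i+1} = App(t_i, t_i): each level shares the previous one.
t = Var('x')
for _ in range(10):
    t = App(t, t)

print(shared_size(t), unfolded_size(t))   # 11 nodes shared vs. 2047 unfolded
```

With n levels of sharing the DAG has n + 1 nodes while the unfolded term has 2^(n+1) - 1, so an algorithm that is efficient in the shared size must work directly on the DAGs, as the linear-time algorithm does.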

15:30 – 16:30
Hugo Férée (IRIF, Univ. Paris Diderot)

Complexity theory has mainly focused on computations over intrinsically finite objects such as finite words, integers, trees and graphs. In a broad sense, higher-order complexity deals with computations over more general domains and codomains, including real numbers, co-inductive datatypes and higher-order functions.

I will start by presenting a few results related to second-order complexity which motivated the main topic of this talk: a proposed definition of the complexity of higher-order functions based on game semantics. Finally, I will outline some potential applications and extensions of this work that I intend to pursue in the near future.