Séminaire, 13 mars 2014


10:30 – 12:00
Benoît Valiron (PPS, Univ. Paris 7)

The field of quantum algorithms is vibrant. Nevertheless, there is currently a lack of programming languages for describing and formalizing quantum computation on a practical scale, i.e., beyond toy problems. We address this issue by introducing Quipper, a scalable, expressive, functional, higher-order quantum programming language. Quipper has been used to program a diverse set of non-trivial quantum algorithms, and can generate quantum gate representations with trillions of gates.

In this talk, I will present the setting and the current (limited) semantics for quantum computation. I will introduce Quipper and discuss what is currently missing, namely a linear type system to capture the non-duplicability of quantum data. From there I will present two uses of this type system: (1) a denotational semantics for all of quantum computation, and (2) resource estimation for quantum algorithms.
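The non-duplicability constraint can be illustrated with a small sketch (not Quipper, and in Python rather than Quipper's host language Haskell): a toy one-shot "qubit handle" whose reuse is detected at run time. A linear type system, as discussed in the talk, would reject the second use statically instead. All names here (`LinearQubit`, `apply_gate`) are hypothetical.

```python
class LinearQubit:
    """Toy stand-in for a qubit reference that must be used linearly."""
    def __init__(self, label):
        self.label = label
        self._consumed = False

    def consume(self):
        # Using the qubit (e.g. feeding it to a gate) consumes the handle;
        # a second use violates linearity (no-cloning of quantum data).
        if self._consumed:
            raise RuntimeError(f"qubit {self.label} used twice")
        self._consumed = True
        return self.label

def apply_gate(gate_name, qubit):
    # Hypothetical helper: applying a gate consumes the input handle
    # and returns a fresh handle for the output wire.
    return LinearQubit(f"{gate_name}({qubit.consume()})")

q = LinearQubit("q0")
q1 = apply_gate("H", q)        # fine: first (and only) use of q
try:
    apply_gate("X", q)         # second use of the same handle
except RuntimeError as e:
    print("rejected:", e)
print("output wire:", q1.label)   # prints: output wire: H(q0)
```

The point of the linear type system is precisely to turn this run-time check into a compile-time guarantee.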

14:00 – 15:00
Pierre-Marie Pédrot (PPS, Univ. Paris 7)

The Dialectica translation is a logical transformation described by Gödel in 1958, but designed in the 1930s. At the end of the 1980s, it was given a categorical counterpart, which turned out to be compatible with the usual decomposition of intuitionistic logic into linear logic. Still, it lacked a true Curry-Howard interpretation.

We will fill this gap by presenting the computational content of Dialectica by means of an untyped lambda-calculus translation. We will show that this translation admits a remarkably simple explanation as soon as we run our source term in the Krivine abstract machine, except for one disturbing detail, seemingly deeply rooted in linear logic. We will also show how our presentation applies naturally to a dependently-typed system almost without adaptation, thus giving insight into how linear dependent types might be built (or not).
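For readers unfamiliar with it, the Krivine abstract machine mentioned above is a simple device for weak head reduction of untyped lambda-terms. The sketch below (in Python, with de Bruijn indices) implements only the standard machine as background; the Dialectica translation itself is the subject of the talk and is not implemented here.

```python
# Minimal Krivine abstract machine (KAM) for the pure untyped
# lambda-calculus, using de Bruijn indices. A machine state is
# (term, environment, stack); environments bind closures (term, env).
from dataclasses import dataclass

@dataclass
class Var:
    idx: int          # de Bruijn index

@dataclass
class Lam:
    body: object

@dataclass
class App:
    fun: object
    arg: object

def kam(term, env=(), stack=()):
    """Run the KAM until a lambda faces an empty stack (weak head nf)."""
    while True:
        if isinstance(term, App):
            # Push the argument as a closure; go evaluate the function.
            stack = ((term.arg, env),) + stack
            term = term.fun
        elif isinstance(term, Lam):
            if not stack:
                return term, env          # weak head normal form reached
            clo, stack = stack[0], stack[1:]
            env = (clo,) + env            # bind the argument, enter body
            term = term.body
        else:
            # Variable: fetch and enter the closure it is bound to.
            term, env = env[term.idx]

identity = Lam(Var(0))
result, _ = kam(App(identity, Lam(Lam(Var(1)))))   # (λx.x) (λy.λz.y)
print(result)   # Lam(body=Lam(body=Var(idx=1)))
```

The talk's observation is that the Dialectica translation of a term becomes transparent once one tracks how such machine states evolve.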

15:30 – 16:30
Hugo Férée (LORIA)

There are numerous notions of computability for non-standard structures, such as higher-order functions or real-number computation. However, most of the underlying computational models lack a natural notion of complexity. Indeed, while we understand well what the complexity of a first-order or even second-order function is, this is far from being the case for other, more complex objects.

I will describe a few existing computational models, especially for higher-order functions, and explain why they are not complexity-friendly. Then I will propose a general framework based on game semantics to define meaningful notions of size and complexity for quite general objects, and provide several examples to illustrate it.