Damiano Mazza (CNRS, Univ. Paris 13), Automatic Differentiation in PCF

Abstract

Automatic differentiation (AD) is the science of efficiently computing the derivative (or gradient, or Jacobian) of functions specified by computer programs. It is a fundamental tool in several fields, most notably machine learning, where it is the key to training deep neural networks. Although AD techniques natively apply to a restricted class of programs, namely first-order straight-line programs, the rise of so-called differentiable programming in recent years has created a pressing need to apply AD to complex programs, endowed with control-flow operators and higher-order combinators such as map and fold. In this talk, I will discuss the extension of AD algorithms to PCF, an (idealized) purely functional programming language. We will first consider the simply-typed lambda-calculus, showing in particular how linear negation is related to reverse-mode AD (a.k.a. backpropagation), and then see how the extra features of PCF, namely full recursion and conditionals, may be dealt with, stressing the difficulties posed by the latter.

[Joint work with Aloïs Brunel (Deepomatic) and Michele Pagani (IRIF, Université de Paris)]
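To give a flavour of the connection between backpropagation and linear negation mentioned in the abstract, here is a minimal Haskell sketch; it is an illustration of the general backpropagator idea, not the calculus developed in the talk, and all names in it (D, addD, mulD, sinD, diff) are made up for this example. Each value is paired with a "backpropagator" of type Double -> Double, playing the role of a linear map (the linear negation of the input type), and the chain rule is threaded through these backpropagators.

```haskell
-- A value paired with its backpropagator: a (linear) map sending the
-- sensitivity of the output back to the sensitivity of the input.
data D = D { primal :: Double, backprop :: Double -> Double }

-- A constant contributes nothing to the gradient.
constD :: Double -> D
constD x = D x (const 0)

-- Arithmetic on such pairs; the chain rule lives in each backpropagator.
addD :: D -> D -> D
addD (D x bx) (D y by) = D (x + y) (\d -> bx d + by d)

mulD :: D -> D -> D
mulD (D x bx) (D y by) = D (x * y) (\d -> bx (d * y) + by (d * x))

sinD :: D -> D
sinD (D x bx) = D (sin x) (\d -> bx (d * cos x))

-- Differentiate a unary function: seed the input with the identity
-- backpropagator and push sensitivity 1 back through the output.
diff :: (D -> D) -> Double -> Double
diff f x = backprop (f (D x id)) 1

-- Example: d/dx (x * sin x) at x = 2, i.e. sin 2 + 2 * cos 2.
main :: IO ()
main = print (diff (\x -> mulD x (sinD x)) 2)
```

In this style, a program of type D -> D simultaneously computes the function and its reverse-mode derivative; how to do this compositionally for all of PCF, including recursion and conditionals, is the subject of the talk.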