Webpage of the Compositional Systems and Methods group at TalTech.

06 August 2020, 17:00, online


Jonathan Gallagher, University of Calgary


Introduction to differential categories


The field of automatic differentiation is experiencing a surge of interest due to its connections with gradient descent and machine learning. Gradient descent is now being used to tune the parameters of programs far more general than feedforward neural networks, including recurrent neural networks and even arbitrary programs. The only assumption is that the program represents a differentiable function, and the emerging field is being called differential programming.
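To make the idea concrete, here is a minimal sketch (not part of the talk) of forward-mode automatic differentiation via dual numbers, driving gradient descent on an ordinary Python function. The names `Dual`, `derivative`, and `gradient_descent` are illustrative, not from any particular library.

```python
class Dual:
    """Number a + b*eps with eps^2 = 0; the b component carries the derivative."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b

    def _coerce(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._coerce(other)
        return Dual(self.a + other.a, self.b + other.b)
    __radd__ = __add__

    def __sub__(self, other):
        other = self._coerce(other)
        return Dual(self.a - other.a, self.b - other.b)
    def __rsub__(self, other):
        return Dual(other) - self

    def __mul__(self, other):
        other = self._coerce(other)
        # Product rule: (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f'(x) by running f on the dual number x + 1*eps."""
    return f(Dual(x, 1.0)).b

def gradient_descent(f, x0, lr=0.1, steps=100):
    """Follow -f'(x) downhill; works for any f built from the ops above."""
    x = x0
    for _ in range(steps):
        x = x - lr * derivative(f, x)
    return x

# Minimise (x - 3)^2; gradient descent converges to x = 3.
x_min = gradient_descent(lambda x: (x - 3) * (x - 3), x0=0.0)
```

The point is that `gradient_descent` never inspects the source of `f`: any program whose primitives have derivatives can be tuned this way, which is the premise of differential programming.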

We will introduce Cartesian differential categories and differential restriction categories as categorical frameworks in which every map has a derivative. Recent work on tools such as TensorFlow's eager mode and PyTorch is moving towards treating differential programming as programming in a domain-specific language: a differential programming language. The axioms of differential categories and their variants are all useful for describing the categorical semantics of differential programming languages. In fact, one axiom has led to an exponential speedup in differentiating recursive programs.
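As a rough illustration of the combinator at the heart of these frameworks: a Cartesian differential category equips every map f with a differential D[f](x, v), the derivative of f at x in direction v, subject to axioms such as the chain rule D[g ∘ f](x, v) = D[g](f(x), D[f](x, v)). The sketch below (my own, with a finite-difference approximation standing in for the combinator) checks this axiom numerically on two sample maps.

```python
def D(f, x, v, h=1e-6):
    """Approximate the differential D[f](x, v) of f: R -> R at x along v
    by a central finite difference."""
    return (f(x + h * v) - f(x - h * v)) / (2 * h)

f = lambda x: x * x   # f(x) = x^2, so D[f](x, v) = 2*x*v
g = lambda y: 3 * y   # g(y) = 3y,  so D[g](y, w) = 3*w

x, v = 2.0, 1.0
lhs = D(lambda t: g(f(t)), x, v)   # D[g . f](x, v)
rhs = D(g, f(x), D(f, x, v))       # D[g](f(x), D[f](x, v))
# Both sides agree (here both are approximately 12 = 3 * 2 * 2 * 1),
# as the chain rule axiom requires.
```

In a differential category this equation is an axiom holding exactly, not a numerical approximation; the other axioms similarly pin down additivity in the direction argument and compatibility with the Cartesian product structure.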