Abstract
Automatic differentiation (AD) has been a topic of interest for researchers in many disciplines, with increased popularity since its application to machine learning and neural networks. Although many researchers appreciate and know how to apply AD, it remains a challenge to truly understand the underlying processes. From an algebraic point of view, however, AD appears surprisingly natural: it originates from the laws of differentiation. In this work we use Algebra of Programming techniques to reason about different AD variants, leveraging Haskell to illustrate our observations. Our findings stem from three fundamental algebraic abstractions: (1) the notion of a semimodule, (2) Nagata's construction of the ‘idealization of a module’, and (3) Kronecker's delta function, which together allow us to write a single-line abstract definition of AD. From this single-line definition, and by instantiating our algebraic structures in various ways, we derive different AD variants that have the same extensional behaviour but different intensional properties, mainly in terms of (asymptotic) computational complexity. We show the different variants to be equivalent by means of Kronecker isomorphisms, a further elaboration of our Haskell infrastructure that guarantees correctness by construction. With this framework in place, this paper seeks to make AD variants more comprehensible by taking an algebraic perspective on the matter.
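To make the abstract's three ingredients concrete, here is a minimal, self-contained Haskell sketch. It is an illustrative reconstruction, not the paper's actual code: the names `Semiring`, `Semimodule`, `Nagata`, `eval`, `delta`, and `derive` are assumed here, and only the simplest instantiation is shown, namely Nagata's idealization of `Double` over itself (classic dual numbers), from which forward-mode AD falls out of a one-line definition.

```haskell
{-# LANGUAGE MultiParamTypeClasses #-}
{-# LANGUAGE FlexibleInstances #-}

-- Semirings: the shared algebraic structure of values and expressions.
class Semiring d where
  zero, one    :: d
  (<+>), (<.>) :: d -> d -> d

-- A d-semimodule e: an additive structure with a scaling action of d.
class Semiring d => Semimodule d e where
  zeroE  :: e
  (<+.>) :: e -> e -> e
  (|*|)  :: d -> e -> e

instance Semiring Double where
  zero = 0; one = 1
  (<+>) = (+); (<.>) = (*)

instance Semimodule Double Double where
  zeroE  = 0
  (<+.>) = (+)
  (|*|)  = (*)

-- Nagata's idealization: a value paired with a tangent component.
data Nagata d e = N d e deriving Show

-- The idealization is again a semiring; its multiplication bakes in
-- the Leibniz (product) rule, which is where differentiation enters.
instance Semimodule d e => Semiring (Nagata d e) where
  zero = N zero zeroE
  one  = N one  zeroE
  N f df <+> N g dg = N (f <+> g) (df <+.> dg)
  N f df <.> N g dg = N (f <.> g) ((f |*| dg) <+.> (g |*| df))

-- Free semiring expressions over variables of type v.
data Expr v = Var v | Zero | One
            | Add (Expr v) (Expr v) | Mul (Expr v) (Expr v)

-- Generic evaluation into any semiring, given a meaning for variables.
eval :: Semiring d => (v -> d) -> Expr v -> d
eval gen (Var x)   = gen x
eval _   Zero      = zero
eval _   One       = one
eval gen (Add a b) = eval gen a <+> eval gen b
eval gen (Mul a b) = eval gen a <.> eval gen b

-- Kronecker's delta: one on the distinguished variable, zero elsewhere.
delta :: (Eq v, Semiring d) => v -> v -> d
delta x y = if x == y then one else zero

-- Forward-mode AD in a single line: evaluate in the Nagata semiring,
-- seeding every variable's tangent with Kronecker's delta.
derive :: Eq v => v -> (v -> Double) -> Expr v -> Nagata Double Double
derive x env = eval (\y -> N (env y) (delta x y))

-- Example: d/dx (x*x + y) at x = 3, y = 5 prints N 14.0 6.0.
main :: IO ()
main = print (derive "x" env (Add (Mul (Var "x") (Var "x")) (Var "y")))
  where env v = if v == "x" then 3 else 5
```

In the spirit of the abstract, only the `Semimodule` instance would need to change to obtain other AD variants with different asymptotic complexity (e.g. sparse maps of partial derivatives, or Cayley-style function representations), while `eval` and `derive` stay fixed.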
| Original language | English |
|---|---|
| Article number | 103010 |
| Journal | Science of Computer Programming |
| Volume | 231 |
| Early online date | 15 Aug 2023 |
| DOIs | |
| Publication status | Published - Jan 2024 |
Keywords
- Automatic differentiation
- Cayley representation
- Kronecker
- Nagata idealization
- Semimodules
ASJC Scopus subject areas
- Software
- Information Systems
- Modelling and Simulation
- Computational Theory and Mathematics