Automatic Differentiation in Prolog
Published online by Cambridge University Press: 06 July 2023
Abstract
Automatic differentiation (AD) is a range of algorithms to compute the numeric value of a function’s (partial) derivative, where the function is typically given as a computer program or abstract syntax tree. AD has become immensely popular as part of many learning algorithms, notably for neural networks. This paper uses Prolog to systematically derive gradient-based forward- and reverse-mode AD variants from a simple executable specification: evaluation of the symbolic derivative. Along the way we demonstrate that several Prolog features (DCGs, co-routines) contribute to the succinct formulation of the algorithm. We also discuss two applications in probabilistic programming that are enabled by our Prolog algorithms. The first is parameter learning for the Sum-Product Loop Language and the second consists of both parameter learning and variational inference for probabilistic logic programming.
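The paper's starting point, evaluating the symbolic derivative, can be illustrated with a minimal Prolog sketch (not the paper's own code; predicate names `d/3` and `eval/3` are illustrative assumptions):

```prolog
% d(+Expr, +Var, -DExpr): DExpr is the symbolic partial derivative
% of Expr with respect to the variable Var (variables are atoms).
d(X, X, 1) :- atom(X), !.
d(C, _, 0) :- number(C), !.
d(Y, X, 0) :- atom(Y), Y \= X, !.
d(A + B, X, DA + DB) :- d(A, X, DA), d(B, X, DB).
d(A * B, X, DA * B + A * DB) :- d(A, X, DA), d(B, X, DB).

% eval(+Expr, +Bindings, -Value): evaluate an expression under
% variable bindings given as a list of Var-Value pairs.
eval(C, _, C) :- number(C), !.
eval(X, Bs, V) :- atom(X), !, member(X - V, Bs).
eval(A + B, Bs, V) :- eval(A, Bs, VA), eval(B, Bs, VB), V is VA + VB.
eval(A * B, Bs, V) :- eval(A, Bs, VA), eval(B, Bs, VB), V is VA * VB.

% ?- d(x*x + 3*x, x, D), eval(D, [x - 2.0], G).
% G = 7.0  (the derivative of x^2 + 3x at x = 2)
```

Forward- and reverse-mode AD then arise as program transformations of this specification that avoid building the symbolic derivative explicitly.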
- Type: Original Article
- Journal: Theory and Practice of Logic Programming, Volume 23, Issue 4: 2023 International Conference on Logic Programming, July 2023, pp. 900–917
- Copyright: © The Author(s), 2023. Published by Cambridge University Press
Footnotes
This project is partly funded by the Flemish Fund for Scientific Research (FWO).