TY - GEN
T1 - Taming Differentiable Logics with Coq Formalisation
AU - Affeldt, Reynald
AU - Bruni, Alessandro
AU - Komendantskaya, Ekaterina
AU - Slusarz, Natalia
AU - Stark, Kathrin
N1 - Publisher Copyright:
© 2024 Reynald Affeldt, Alessandro Bruni, Ekaterina Komendantskaya, Natalia Slusarz, and Kathrin Stark.
PY - 2024/9/2
Y1 - 2024/9/2
N2 - For performance and verification in machine learning, new methods have recently been proposed that optimise learning systems to satisfy formally expressed logical properties. Among these methods, differentiable logics (DLs) are used to translate propositional or first-order formulae into loss functions deployed for optimisation in machine learning. At the same time, recent attempts to give programming language support for verification of neural networks showed that DLs can be used to compile verification properties to machine-learning backends. This situation calls for stronger guarantees about the soundness of such compilers, the soundness and compositionality of DLs, and the differentiability and performance of the resulting loss functions. In this paper, we propose an approach to formalise existing DLs using the Mathematical Components library in the Coq proof assistant. Thanks to this formalisation, we are able to give uniform semantics to otherwise disparate DLs, give formal proofs to existing informal arguments, find errors in previous work, and provide formal proofs of properties that had previously only been conjectured. This work is meant as a stepping stone for the development of programming language support for verification of machine learning.
AB - For performance and verification in machine learning, new methods have recently been proposed that optimise learning systems to satisfy formally expressed logical properties. Among these methods, differentiable logics (DLs) are used to translate propositional or first-order formulae into loss functions deployed for optimisation in machine learning. At the same time, recent attempts to give programming language support for verification of neural networks showed that DLs can be used to compile verification properties to machine-learning backends. This situation calls for stronger guarantees about the soundness of such compilers, the soundness and compositionality of DLs, and the differentiability and performance of the resulting loss functions. In this paper, we propose an approach to formalise existing DLs using the Mathematical Components library in the Coq proof assistant. Thanks to this formalisation, we are able to give uniform semantics to otherwise disparate DLs, give formal proofs to existing informal arguments, find errors in previous work, and provide formal proofs of properties that had previously only been conjectured. This work is meant as a stepping stone for the development of programming language support for verification of machine learning.
KW - Differentiable Logics
KW - Interactive Theorem Proving
KW - Logic and Semantics
KW - Loss Functions
KW - Machine Learning
UR - http://www.scopus.com/inward/record.url?scp=85204083450&partnerID=8YFLogxK
U2 - 10.4230/LIPIcs.ITP.2024.4
DO - 10.4230/LIPIcs.ITP.2024.4
M3 - Conference contribution
AN - SCOPUS:85204083450
T3 - Leibniz International Proceedings in Informatics (LIPIcs)
BT - 15th International Conference on Interactive Theorem Proving (ITP 2024)
A2 - Bertot, Yves
A2 - Kutsia, Temur
A2 - Norrish, Michael
PB - Schloss Dagstuhl - Leibniz-Zentrum für Informatik
T2 - 15th International Conference on Interactive Theorem Proving 2024
Y2 - 9 September 2024 through 14 September 2024
ER -