TY - GEN
T1 - A Certified Proof Checker for Deep Neural Network Verification in Imandra
AU - Desmartin, Rémi
AU - Isac, Omri
AU - Passmore, Grant
AU - Komendantskaya, Ekaterina
AU - Stark, Kathrin
AU - Katz, Guy
N1 - Publisher Copyright:
© Remi Desmartin, Omri Isac, Grant Passmore, Ekaterina Komendantskaya, Kathrin Stark, and Guy Katz;
PY - 2025/9/22
Y1 - 2025/9/22
N2 - Recent advances in the verification of deep neural networks (DNNs) have opened the way for a broader usage of DNN verification technology in many application areas, including safety-critical ones. However, DNN verifiers are themselves complex programs that have been shown to be susceptible to errors and numerical imprecision; this, in turn, has raised the question of trust in DNN verifiers. One prominent attempt to address this issue is enhancing DNN verifiers with the capability of producing certificates of their results that are subject to independent algorithmic checking. While formulations of Marabou certificate checking already exist on top of the state-of-the-art DNN verifier Marabou, they are implemented in C++, and that code itself raises the question of trust (e.g., in the precision of floating point calculations or guarantees for implementation soundness). Here, we present an alternative implementation of the Marabou certificate checking in Imandra – an industrial functional programming language and an interactive theorem prover (ITP) – that allows us to obtain full proof of certificate correctness. The significance of the result is two-fold. Firstly, it gives stronger independent guarantees for Marabou proofs. Secondly, it opens the way for the wider adoption of DNN verifiers in interactive theorem proving in the same way as many ITPs already incorporate SMT solvers.
KW - Farkas Lemma
KW - Neural Network Verification
KW - Proof Certification
UR - https://www.scopus.com/pages/publications/105019535388
U2 - 10.4230/LIPIcs.ITP.2025.1
DO - 10.4230/LIPIcs.ITP.2025.1
M3 - Conference contribution
AN - SCOPUS:105019535388
T3 - Leibniz International Proceedings in Informatics (LIPIcs)
BT - 16th International Conference on Interactive Theorem Proving (ITP 2025)
A2 - Forster, Yannick
A2 - Keller, Chantal
PB - Schloss Dagstuhl - Leibniz-Zentrum für Informatik
T2 - 16th International Conference on Interactive Theorem Proving 2025
Y2 - 28 September 2025 through 1 October 2025
ER -