Abstract
Differentiable logics (DLs) have recently been proposed as a method for training neural networks to satisfy logical specifications. A DL consists of a syntax in which specifications are stated and an interpretation function that translates expressions in the syntax into loss functions. These loss functions can then be used during training with standard gradient descent algorithms. The variety of existing DLs and the differing levels of formality with which they are treated make a systematic comparative study of their properties and implementations difficult. This paper remedies this problem by suggesting a meta-language for defining DLs that we call the Logic of Differentiable Logics, or LDL. Syntactically, it generalises the syntax of existing DLs to first-order logic (FOL), and for the first time introduces a formalism for reasoning about vectors and learners. Semantically, it introduces a general interpretation function that can be instantiated to define the loss functions arising from different existing DLs. We use LDL to establish several theoretical properties of existing DLs and to conduct an empirical study of them in neural network verification.

Keywords: Differentiable Logic, Fuzzy Logic, Probabilistic Logic, Machine Learning, Training with Constraints, Types.
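To make the idea concrete, the following is a minimal sketch of the DL pipeline the abstract describes: a logical constraint is interpreted as a differentiable truth value, which is turned into a loss and minimised by gradient descent. It assumes a product fuzzy-logic style translation (one family of DLs the paper covers), not the paper's own LDL interpretation; the helper names `truth_geq`, `truth_and`, and `constraint_loss` are illustrative, not from the paper.

```python
import torch

def truth_geq(lhs: torch.Tensor, rhs: torch.Tensor) -> torch.Tensor:
    # Interpret the atomic constraint lhs >= rhs as a smooth
    # truth value in (0, 1): 1 when the constraint holds by a margin.
    return torch.sigmoid(lhs - rhs)

def truth_and(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    # Product t-norm: conjunction as multiplication of truth values.
    return a * b

def constraint_loss(truth: torch.Tensor) -> torch.Tensor:
    # Turn a truth value into a loss: a fully satisfied
    # specification (truth = 1) incurs zero loss.
    return 1.0 - truth.mean()

# Example: train a small network so its two outputs satisfy the
# specification (y0 >= 0.2) AND (y1 >= 0.2) on a batch of inputs.
model = torch.nn.Linear(4, 2)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x = torch.randn(8, 4)
for _ in range(100):
    y = model(x)
    truth = truth_and(truth_geq(y[:, 0], torch.tensor(0.2)),
                      truth_geq(y[:, 1], torch.tensor(0.2)))
    loss = constraint_loss(truth)  # standard gradient descent on the DL loss
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In practice such a constraint loss would be combined with an ordinary data loss; different DLs differ precisely in how connectives like `truth_and` and atoms like `truth_geq` are interpreted, which is the variation LDL's general interpretation function captures.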
| Original language | English |
| --- | --- |
| Pages (from-to) | 473-493 |
| Number of pages | 21 |
| Journal | EPiC Series in Computing |
| Volume | 94 |
| DOIs | |
| Publication status | Published - 3 Jun 2023 |
| Event | 24th International Conference on Logic for Programming, Artificial Intelligence and Reasoning 2023, Manizales, Colombia. Duration: 4 Jun 2023 → 9 Jun 2023 |
Keywords
- declarative semantics
- Differentiable Logic
- Fuzzy Logic
- machine learning
- probabilistic logic
- types
ASJC Scopus subject areas
- General Computer Science