Mixtures of probabilistic logic programs

Detailed bibliography
Published in: International Journal of Approximate Reasoning, Volume 186, p. 109497
Main author: Azzolini, Damiano
Format: Journal Article
Language: English
Publication details: Elsevier Inc., 01.11.2025
ISSN: 0888-613X
Description
Summary: Structure learning (SL) is a fundamental task in Statistical Relational Artificial Intelligence, where the goal is to learn a program from data. Among the possible target languages, there is Probabilistic Logic Programming. Mixture models have recently gained attention thanks to their effectiveness in modeling complex distributions by combining simpler ones. In this paper, we propose learning a mixture of probabilistic logic programs to handle SL. Our method consists of three steps: 1) generating mixture components with a specific structure, 2) applying parameter learning to each component, and 3) optimizing the weights associated with each component. Furthermore, to possibly reduce the number of components and mitigate overfitting, we also explore the use of L1 and L2 regularization. Empirical results obtained by considering both the full set of components and only a fraction of them demonstrate that our approach, despite being seemingly simple, is competitive with state-of-the-art solvers.

Highlights:
• Mixtures of probabilistic logic programs are effective for structure learning.
• Even considering a small fraction of the total components yields good results.
• Higher likelihood does not necessarily imply higher AUC-ROC or AUC-PR.
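A rough sketch of how the three steps described in the summary could compose, offered as an interpretation of the abstract rather than the paper's actual formulation: once K mixture components P_1, ..., P_K have been generated (step 1) and individually fitted by parameter learning (step 2), the mixture assigns each example e the probability

P(e) = \sum_{k=1}^{K} w_k \, P_k(e), \qquad w_k \ge 0, \quad \sum_{k=1}^{K} w_k = 1,

and the weight optimization of step 3 can then be read as maximizing a regularized log-likelihood over training examples e_1, ..., e_M,

\hat{w} = \arg\max_{w} \sum_{i=1}^{M} \log P(e_i) \; - \; \lambda \lVert w \rVert_1 \quad \text{(or } - \lambda \lVert w \rVert_2^2 \text{ for L2)},

where K, M, the weights w_k, and the hyperparameter \lambda are notation introduced here, not taken from the article. Under this reading, the L1 penalty can drive some weights to exactly zero, which matches the abstract's remark that regularization may reduce the number of components.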
DOI: 10.1016/j.ijar.2025.109497