Learning theories using estimation distribution algorithms and (reduced) bottom clauses

Abstract

Genetic Algorithms (GAs) are known for their ability to explore large search spaces, and for this reason they have been applied, to some extent, to Inductive Logic Programming (ILP). Although Estimation of Distribution Algorithms (EDAs) generally perform better than standard GAs, they had not previously been applied to ILP. This work presents EDA-ILP, an ILP system based on EDAs and inverse entailment, together with its extension REDA-ILP, which applies the Reduce algorithm to bottom clauses to considerably shrink the search space. Experiments on real-world datasets show that both systems compare favorably with Aleph and with GA-ILP (a variant of EDA-ILP obtained by replacing the EDA with a standard GA). EDA-ILP also compares favorably with Progol-QG/GA (and its other variants) on phase-transition benchmarks. In addition, REDA-ILP usually obtains simpler theories than EDA-ILP, more efficiently and with equivalent accuracy. These results show that EDAs provide a good basis for stochastic search in ILP.
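To make the core idea concrete, the sketch below shows a PBIL-style univariate EDA searching over subsets of bottom-clause literals, which is the general shape of search the abstract describes. It is a minimal illustration only, not the paper's actual implementation: the literal list, fitness interface, and all parameter values are hypothetical, and EDA-ILP's real fitness would score a candidate clause by its coverage of positive and negative examples.

```python
import random

# Hypothetical literals of a bottom clause for a target predicate;
# a candidate clause is a subset of these (a vector of include/exclude bits).
BOTTOM_CLAUSE_LITERALS = ["parent(A,B)", "male(A)", "female(B)", "older(A,B)"]

def sample_clause(probs):
    """Sample a candidate clause: include literal i with probability probs[i]."""
    return [random.random() < p for p in probs]

def eda_search(fitness, n_literals, pop_size=50, n_best=10,
               learning_rate=0.3, generations=30):
    """PBIL-style univariate EDA: evolve a probability vector over literals."""
    probs = [0.5] * n_literals  # start from a uniform distribution
    best, best_fit = None, float("-inf")
    for _ in range(generations):
        pop = [sample_clause(probs) for _ in range(pop_size)]
        pop.sort(key=fitness, reverse=True)
        elite = pop[:n_best]
        # Shift each marginal toward the frequency of the elite samples.
        for i in range(n_literals):
            freq = sum(ind[i] for ind in elite) / n_best
            probs[i] += learning_rate * (freq - probs[i])
        if fitness(pop[0]) > best_fit:
            best, best_fit = pop[0], fitness(pop[0])
    return best, probs

# Toy stand-in for a coverage-based score: reward useful literals,
# penalize clause length (purely illustrative weights).
def toy_fitness(clause):
    weights = [2, 1, 1, 2]
    return sum(w for w, used in zip(weights, clause) if used) - 0.5 * sum(clause)

best, probs = eda_search(toy_fitness, len(BOTTOM_CLAUSE_LITERALS))
```

Under this reading, REDA-ILP's contribution corresponds to running the Reduce algorithm first, shrinking BOTTOM_CLAUSE_LITERALS before the EDA search so that the probability vector ranges over far fewer literals.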

Citation (APA)

Pitangui, C. G., & Zaverucha, G. (2012). Learning theories using estimation distribution algorithms and (reduced) bottom clauses. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7207 LNAI, pp. 286–301). https://doi.org/10.1007/978-3-642-31951-8_25
