Empirical behavior of Bayesian network structure learning algorithms

Abstract

Bayesian network structure learning (BNSL) is the problem of finding the BN structure which best explains a dataset. Score-based learning assigns a score to each candidate structure, and the goal is to find the structure which optimizes that score. We review two recent studies of the empirical behavior of BNSL algorithms. The score typically reflects fit to a training dataset; however, models which fit the training data well may generalize poorly, so it is not obvious that finding a score-optimal network is worthwhile. We first review a comparison of exact and approximate search techniques: on simpler datasets the approximate algorithms often suffice, but on more complex datasets the exact algorithms produce better networks. Because BNSL is NP-hard, exact solvers prune the search space using heuristics. We then review problem-dependent characteristics which affect the efficacy of this pruning; empirical results show that machine learning techniques based on these characteristics can often predict the algorithms' running times accurately.
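To make the score-based formulation concrete, the sketch below shows one deliberately simple instantiation in Python: a BIC family score for discrete data and a greedy hill-climbing search that repeatedly adds the single best edge. This is only an illustrative example of the general setup described in the abstract, not the exact or approximate solvers evaluated in the paper; the function names, toy data, and hill-climbing strategy are assumptions made for the example.

```python
# Minimal, illustrative score-based BNSL sketch (not the paper's algorithms):
# a BIC score for discrete data plus greedy hill climbing over edge additions.
import math
from itertools import product

def bic_family_score(data, child, parents, arity):
    """BIC score of one variable given its parent set (larger is better)."""
    n = len(data)
    counts, parent_counts = {}, {}
    for row in data:
        pcfg = tuple(row[p] for p in parents)
        counts[(pcfg, row[child])] = counts.get((pcfg, row[child]), 0) + 1
        parent_counts[pcfg] = parent_counts.get(pcfg, 0) + 1
    loglik = sum(c * math.log(c / parent_counts[pcfg])
                 for (pcfg, _), c in counts.items())
    # Free parameters: (r_child - 1) * product of parent arities.
    q = 1
    for p in parents:
        q *= arity[p]
    penalty = 0.5 * math.log(n) * (arity[child] - 1) * q
    return loglik - penalty

def creates_cycle(parents, frm, to):
    """Would adding the edge frm -> to create a directed cycle?"""
    # Cycle iff `to` is already an ancestor of `frm`: follow parent links from `frm`.
    stack, seen = [frm], set()
    while stack:
        v = stack.pop()
        if v == to:
            return True
        if v in seen:
            continue
        seen.add(v)
        stack.extend(parents[v])
    return False

def hill_climb(data, variables, arity):
    """Greedy search: repeatedly add the edge with the largest score gain."""
    parents = {v: [] for v in variables}
    score = {v: bic_family_score(data, v, [], arity) for v in variables}
    improved = True
    while improved:
        improved = False
        best_gain, best_edge = 0.0, None
        for frm, to in product(variables, variables):
            if frm == to or frm in parents[to] or creates_cycle(parents, frm, to):
                continue
            gain = bic_family_score(data, to, parents[to] + [frm], arity) - score[to]
            if gain > best_gain:
                best_gain, best_edge = gain, (frm, to)
        if best_edge is not None:
            frm, to = best_edge
            parents[to].append(frm)
            score[to] += best_gain
            improved = True
    return parents, sum(score.values())

# Toy usage with hypothetical data: A determines B and C, so the search
# should recover edges into B and C.
data = [{"A": a, "B": a, "C": a ^ 1} for a in (0, 1) for _ in range(20)]
arity = {"A": 2, "B": 2, "C": 2}
dag, total_score = hill_climb(data, ["A", "B", "C"], arity)
print(dag, total_score)
```

Even this toy example shows why the search is hard: every edge addition must be re-scored against the current structure, and greedy choices can get stuck in local optima, which is the gap between approximate and exact solvers that the reviewed studies examine.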

Citation (APA)

Malone, B. (2015). Empirical behavior of Bayesian network structure learning algorithms. In Lecture Notes in Computer Science (Vol. 9505, pp. 105–121). Springer. https://doi.org/10.1007/978-3-319-28379-1_8
