Progress on Minimum Description Length Principle: From Basis to Advanced Topics

  • YAMANISHI, K.
Citations: not available
Mendeley readers: 17

Abstract

The minimum description length (MDL) principle is a data-compression-based methodology for optimal estimation and prediction from data. It gives a unifying strategy for designing machine learning algorithms and plays an important role in knowledge discovery from big data. Conventionally, the MDL principle has been studied extensively under the assumption that the information sources are stationary and are represented as regular probabilistic models. This paper first gives a survey of the fundamental concepts of the MDL principle. It then introduces recent advances in MDL research for situations where the information sources are nonstationary, irregular, or nonprobabilistic. It also reviews trends in the nonasymptotic analysis of the MDL and refers to applications in data mining.

Key words: minimum description length principle, normalized maximum likelihood code, stochastic complexity, model selection, latent variable model
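To make the code-length view of model selection described in the abstract concrete, the following Python sketch (not taken from the paper) selects a polynomial degree by minimizing a classical two-part code length, the well-known asymptotic approximation to the stochastic complexity rather than the exact normalized maximum likelihood code the survey discusses. The synthetic data and the helper name `two_part_code_length` are illustrative assumptions.

```python
# Minimal MDL-style model selection sketch: pick the polynomial degree that
# minimizes an approximate two-part code length (in nats),
#   L(data | model) + L(model)  ~  (n/2) * log(RSS/n) + (k/2) * log(n).
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = np.linspace(-1.0, 1.0, n)
# Synthetic data whose true generating polynomial has degree 2.
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(scale=0.3, size=n)

def two_part_code_length(x, y, degree):
    """Approximate description length of y under a degree-`degree` polynomial."""
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    rss = float(residuals @ residuals)
    k = degree + 1                         # number of free parameters
    data_cost = 0.5 * n * np.log(rss / n)  # negative max log-likelihood, up to constants
    model_cost = 0.5 * k * np.log(n)       # cost of describing the parameters
    return data_cost + model_cost

code_lengths = {d: two_part_code_length(x, y, d) for d in range(6)}
best_degree = min(code_lengths, key=code_lengths.get)
print("MDL-selected degree:", best_degree)  # typically 2 for this synthetic data
```

The parameter term `(k/2) log n` is only the leading-order penalty; the survey's advanced topics concern sharper, nonasymptotic code lengths (e.g., NML-based stochastic complexity) and extensions to latent variable and irregular models where this crude penalty is not valid.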

Citation (APA)
YAMANISHI, K. (2017). Progress on Minimum Description Length Principle: From Basis to Advanced Topics. IEICE ESS Fundamentals Review, 10(3), 186–194. https://doi.org/10.1587/essfr.10.3_186
