Information perspective of optimization


Abstract

In this paper we relate information theory and Kolmogorov Complexity (KC) to optimization in the black-box scenario. We define the set of all possible decisions an algorithm might make during a run, associate with each function a probability distribution over this set, and accordingly define its entropy. We show that the expected KC of the set (rather than of the function) is a better measure of problem difficulty. We analyze the effect of the entropy on the expected KC. Finally, we show, for a restricted scenario, that any permutation closure of a single function, the finest level of granularity for which a No Free Lunch theorem can hold [7], can be associated with a particular value of entropy. This implies bounds on the expected performance of an algorithm on members of that closure. © Springer-Verlag Berlin Heidelberg 2006.
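As a rough illustration of the entropy notion the abstract refers to (the paper's exact construction over decision sets is more involved), the sketch below computes the Shannon entropy of a probability distribution, contrasting a uniform distribution with a highly peaked one. The example distributions are hypothetical and not taken from the paper.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    # Terms with p = 0 contribute 0 by convention and are skipped.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical distributions a function might induce over four decisions:
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # uniform: 2.0 bits
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # peaked: much lower
```

A higher-entropy (more uniform) distribution leaves the algorithm less predictable structure to exploit, which is the intuition behind relating entropy to expected problem difficulty.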

Citation (APA)

Borenstein, Y., & Poli, R. (2006). Information perspective of optimization. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4193 LNCS, pp. 102–111). Springer Verlag. https://doi.org/10.1007/11844297_11
