Worst case complexity of problems with random information noise

Abstract

We study the worst case complexity of solving problems for which information is partial and contaminated by random noise. It is well known that if information is exact then adaption does not help for solving linear problems, i.e., for approximating linear operators over convex and symmetric sets. On the other hand, randomization can sometimes help significantly. It turns out that for noisy information, adaption may lead to much better approximations than nonadaption, even for linear problems. This holds because, in the presence of noise, adaption is equivalent to randomization. We present sharp bounds on the worst case complexity of problems with random noise in terms of the randomized complexity with exact information. The results obtained are applied to d-variate integration and L∞-approximation of functions belonging to Hölder and Sobolev classes. Information is given by function evaluations with Gaussian noise of variance σ². For exact information, the two problems are intractable since the complexity is proportional to (1/ε)^q, where q grows linearly with d. For noisy information the situation is different. For integration, the ε-complexity is of order σ²/ε² as ε goes to zero; hence the curse of dimensionality is broken due to random noise. For approximation, the complexity is of order σ²(1/ε)^(q+2) ln(1/ε), and the problem remains intractable even with random noise. © 1996 Academic Press, Inc.
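To make the σ²/ε² rate for noisy integration concrete, the sketch below simply averages n noisy function evaluations at uniform random points. This is plain Monte Carlo, not the algorithm analyzed in the paper; the test function, dimension d, noise level σ, and sample sizes are illustrative assumptions. Since the noise contributes a statistical error of order σ/√n, reaching accuracy ε plausibly requires on the order of σ²/ε² evaluations, which matches the rate stated in the abstract.

```python
# Illustrative sketch only -- not the algorithm from the paper.
# Averaging n function values corrupted by Gaussian noise of variance sigma^2
# behaves like ordinary Monte Carlo integration plus a noise term of order
# sigma/sqrt(n), so the cost of reaching error eps scales like sigma^2/eps^2.
# The test function, dimension, noise level, and sample sizes are arbitrary.

import numpy as np

rng = np.random.default_rng(0)

d = 5          # number of variables (assumed for the demo)
sigma = 0.1    # standard deviation of the evaluation noise (assumed)

def f(x):
    # Smooth test function on [0, 1]^d with a known integral.
    return np.prod(np.sin(np.pi * x), axis=-1)

exact = (2.0 / np.pi) ** d   # integral of prod_i sin(pi * x_i) over [0, 1]^d

for n in (10**3, 10**4, 10**5):
    points = rng.random((n, d))                                 # i.i.d. uniform sample points
    noisy_values = f(points) + sigma * rng.standard_normal(n)   # Gaussian noise, variance sigma^2
    estimate = noisy_values.mean()
    # Total error is of order (std(f) + sigma)/sqrt(n); the last column shows
    # the noise contribution sigma/sqrt(n) as a reference scale.
    print(f"n = {n:6d}  error = {abs(estimate - exact):.4e}  "
          f"sigma/sqrt(n) = {sigma / np.sqrt(n):.1e}")
```

Note that the number of evaluations needed here does not grow with d, which is the sense in which random noise lets the curse of dimensionality be broken for integration.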

Citation (APA)

Plaskota, L. (1996). Worst case complexity of problems with random information noise. Journal of Complexity, 12(4), 416–439. https://doi.org/10.1006/jcom.1996.0026
