The information contained in a string x about a string y is the difference between the Kolmogorov complexity of y and the conditional Kolmogorov complexity of y given x, i.e., I(x:y) = C(y) - C(y|x). The Kolmogorov-Levin theorem says that I(x:y) is symmetric up to a small additive term. We investigate whether this property also holds for several versions of polynomial-time-bounded Kolmogorov complexity. We study symmetry of information for some variants of distinguishing complexity CD, where CD(x) is the length of a shortest program that accepts x and only x. We exhibit relativized worlds where symmetry of information fails in a strong way for the deterministic and nondeterministic polynomial-time distinguishing complexities CD^poly and CND^poly. On the other hand, for nondeterministic polynomial-time distinguishing complexity with randomness, CAMD^poly, we show that symmetry of information holds for most pairs of strings in any set in NP. Our techniques extend the work of Buhrman et al. (Language compression and pseudorandom generators, in: Proc. 19th IEEE Conf. on Computational Complexity, IEEE, New York, 2004, pp. 15-28) on language compression by AM algorithms, and have the following application to the compression of samplable sources, introduced by Trevisan et al. (Compression of samplable sources, in: Proc. 19th IEEE Conf. on Computational Complexity, IEEE, New York, 2004, pp. 1-15): any element x in the support of a polynomial-time samplable source X can be given a description of size -log Pr[X = x] + O(log^3 n), from which x can be recovered by an AM algorithm. © 2005 Elsevier B.V. All rights reserved.
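Since the true Kolmogorov complexity C(·) is uncomputable, the symmetry of information I(x:y) ≈ I(y:x) cannot be computed exactly, but it can be illustrated with a computable proxy. The sketch below (an illustration only, not part of the paper's results) uses zlib-compressed length as a crude upper bound on C, and estimates the conditional complexity C(y|x) as C(xy) - C(x), in the spirit of compression-based information distance:

```python
import zlib

def C(s: bytes) -> int:
    """Toy stand-in for Kolmogorov complexity C(s): zlib-compressed length.
    (True C is uncomputable; compression gives only a crude upper bound.)"""
    return len(zlib.compress(s, 9))

def C_cond(y: bytes, x: bytes) -> int:
    """Proxy for conditional complexity C(y|x): the extra compressed bytes
    needed for y once x is already described, estimated as C(xy) - C(x)."""
    return C(x + y) - C(x)

def I(x: bytes, y: bytes) -> int:
    """Information in x about y, following the abstract: I(x:y) = C(y) - C(y|x)."""
    return C(y) - C_cond(y, x)

# Two highly similar strings share most of their information.
x = b"the quick brown fox jumps over the lazy dog" * 20
y = b"the quick brown fox jumps over the lazy cat" * 20

# The Kolmogorov-Levin theorem predicts I(x:y) and I(y:x) agree up to a
# small additive term; with this proxy the two estimates come out close.
print(I(x, y), I(y, x))
```

Under this proxy, I(x:y) simplifies to C(x) + C(y) - C(xy), so the asymmetry reduces to the (typically tiny) difference between C(xy) and C(yx); this is only a heuristic analogue of the additive term in the theorem, not a bound it proves.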
Lee, T., & Romashchenko, A. (2005). Resource bounded symmetry of information revisited. Theoretical Computer Science, 345, 386–405. https://doi.org/10.1016/j.tcs.2005.07.017