Abstract
There are several sorts of Kolmogorov complexity or, better to say, several Kolmogorov complexities: decision complexity, simple complexity, prefix complexity, monotone complexity, and a priori complexity. The last three can, and the first two cannot, be used for defining randomness of an infinite binary sequence. All five versions of Kolmogorov complexity were considered, from a unified point of view, in a paper by the first author which appeared in Watanabe's book [23]. Upper and lower bounds for those complexities, and also for their differences, were announced in that paper without proofs. (Some of those bounds are mentioned in Section 4.4.5 of [16].) The purpose of this paper (which can be read independently of [23]) is to give proofs for the bounds from [23].
The terminology used in this paper is somewhat nonstandard: we call "Kolmogorov entropy" what is usually called "Kolmogorov complexity." This is a Moscow tradition suggested by Kolmogorov himself. By this tradition the term "complexity" relates to any mode of description, and "entropy" is the complexity related to an optimal mode (i.e., to a mode that, roughly speaking, gives the shortest descriptions).
References
C. Calude, Information and Randomness. An Algorithmic Perspective, Springer-Verlag, Auckland, in press.
G. J. Chaitin, On the length of programs for computing finite binary sequences, J. Assoc. Comput. Mach., 13 (1966), 547–569.
G. J. Chaitin, On the length of programs for computing finite binary sequences: statistical considerations, J. Assoc. Comput. Mach., 16 (1969), 145–159.
G. J. Chaitin, A theory of program size formally identical to information theory, J. Assoc. Comput. Mach., 22 (1975), 329–340.
G. J. Chaitin, Algorithmic Information Theory, Cambridge University Press, Cambridge, 1987.
G. J. Chaitin, Information, Randomness and Incompleteness. Papers on Algorithmic Information Theory, World Scientific, Singapore, 1987; expanded second edition in 1992.
G. J. Chaitin, Foreword to [1].
P. Gács, On the symmetry of algorithmic information, Soviet Math. Dokl., 15 (1974), 1477–1480. (Translated from the Russian version.)
P. Gács, On the relation between descriptional complexity and algorithmic probability, Theoret. Comput. Sci., 22 (1983), 71–93.
P. Gács, A review of [5], J. Symbolic Logic, 54(2) (1989), 624–627.
P. Gács, Personal communication, April 1993.
A. N. Kolmogorov, Three approaches to the quantitative definition of information, Problems Inform. Transmission, 1(1) (1965), 1–7. (Translated from the Russian version.)
L. A. Levin, On the notion of a random sequence, Soviet Math. Dokl., 14 (1973), 1413–1416. (Translated from the Russian version.)
L. A. Levin, Laws of information conservation (non-growth) and aspects of the foundation of probability theory, Problems Inform. Transmission, 10(3) (1974), 206–210. (Translated from the Russian version.)
L. A. Levin, Various measures of complexity for finite objects (axiomatic description), Soviet Math. Dokl., 17 (1976), 522–526. (Translated from the Russian version.)
M. Li and P. Vitányi, An Introduction to Kolmogorov Complexity and Its Applications, Springer-Verlag, New York, 1993.
D. W. Loveland, A variant of the Kolmogorov concept of complexity, Inform. and Control, 15 (1969), 602–619.
A. A. Markov, On normal algorithms which compute Boolean functions, Soviet Math. Dokl., 5 (1964), 922–924. (Translated from the Russian version.)
C. P. Schnorr, Process complexity and effective random tests, J. Comput. System Sci., 7 (1973), 376–388.
C. P. Schnorr, A survey of the theory of random sequences. In R. E. Butts and J. Hintikka (eds.), Basic Problems in Methodology and Linguistics, Reidel, Dordrecht, 1977, pp. 193–210.
A. Kh. Shen, Algorithmic variants of the notion of entropy, Soviet Math. Dokl., 29(3) (1984), 569–573. (Translated from the Russian version.)
R. Solomonoff, A formal theory of inductive inference, Part I, Inform. and Control, 7 (1964), 1–22.
V. A. Uspensky, Complexity and entropy: an introduction to the theory of Kolmogorov complexity. In O. Watanabe (ed.), Kolmogorov Complexity and Computational Complexity, Springer-Verlag, New York, 1992.
V. A. Uspensky, A. L. Semenov, and A. Kh. Shen, Can an individual sequence of zeros and ones be random?, Russian Math. Surveys, 45(1) (1990), 121–189. (Translated from the Russian version.)
V. V. V'yugin, Algorithmic entropy (complexity) of finite objects and its application to defining randomness and amount of information, Semiotika i Informatika, 16 (1981), 14–43. (English translation: Selecta Mathematica, formerly Sovietica, 13(4) (1994), 357–389.)
A. K. Zvonkin and L. A. Levin, The complexity of finite objects and the development of the concepts of information and randomness by means of the theory of algorithms, Russian Math. Surveys, 25(6) (1970), 83–124. (Translated from the Russian version.)
Additional information
The research described in this publication was made possible in part by Grant No. MQ3000 from the International Science Foundation. A preliminary version of this paper appeared in April, 1993, as CWI's Technical Report CS-R9329 (CWI is the Dutch National Research Institute for Mathematics and Computer Science). That version was supported by the Netherlands Organization for Scientific Research (NWO). The second author was also supported by AMS fSU Aid fund and Volkswagen Stiftung.
Uspensky, V. A., and Shen, A., Relations between varieties of Kolmogorov complexities, Math. Systems Theory, 29 (1996), 271–292. https://doi.org/10.1007/BF01201280