Abstract
Shannon’s information measures refer to entropy, conditional entropy, mutual information, and conditional mutual information. They are the most important measures of information in information theory. In this chapter, we introduce these measures and establish some of their basic properties. The physical meanings of these measures will be discussed in depth in subsequent chapters. We then introduce informational divergence, which measures the “distance” between two probability distributions, and prove some inequalities that are useful throughout information theory. The chapter ends with a section on the entropy rate of a stationary information source.
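For reference, the following is a minimal sketch of the standard definitions behind the measures named above, for discrete random variables X, Y, Z with joint distribution p; the notation here is the conventional one and is an assumption on our part — the chapter fixes its own conventions, including the base of the logarithm (and hence the unit, bits or nats):

\begin{align*}
H(X) &= -\sum_{x} p(x)\log p(x) && \text{(entropy)} \\
H(X \mid Y) &= -\sum_{x,y} p(x,y)\log p(x \mid y) && \text{(conditional entropy)} \\
I(X;Y) &= \sum_{x,y} p(x,y)\log \frac{p(x,y)}{p(x)\,p(y)} && \text{(mutual information)} \\
I(X;Y \mid Z) &= H(X \mid Z) - H(X \mid Y, Z) && \text{(conditional mutual information)} \\
D(p \,\|\, q) &= \sum_{x} p(x)\log \frac{p(x)}{q(x)} && \text{(informational divergence)} \\
H_{\{X_k\}} &= \lim_{n\to\infty} \frac{1}{n}\, H(X_1,\dots,X_n) && \text{(entropy rate)}
\end{align*}

These identities are consistent with one another, e.g. I(X;Y) = H(X) - H(X|Y), and for a stationary source the limit defining the entropy rate exists, which is the standard fact underlying the chapter’s final section.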
Copyright information
© 2002 Springer Science+Business Media New York
Cite this chapter
Yeung, R.W. (2002). Information Measures. In: A First Course in Information Theory. Information Technology: Transmission, Processing and Storage. Springer, Boston, MA. https://doi.org/10.1007/978-1-4419-8608-5_2
Print ISBN: 978-1-4613-4645-6
Online ISBN: 978-1-4419-8608-5