Information theory and rate distortion theory for communications and compression / Jerry Gibson.
| Item type | Current library | Call number | Status | Date due | Barcode |
|---|---|---|---|---|---|
| | Indian Institute of Technology Delhi - Central Library | | Available | | |
Mode of access: World Wide Web.
System requirements: Adobe Acrobat Reader.
Part of: Synthesis digital library of engineering and computer science.
Series from website.
Includes bibliographical references (pages 111-113).
1. Communications, compression and fundamental limits -- 1.1 Shannon's three theorems -- 1.2 The information transmission theorem or separation theorem -- 1.3 Notes and additional references --
2. Entropy and mutual information -- 2.1 Entropy and mutual information -- 2.2 Chain rules for entropy and mutual information -- 2.3 Differential entropy and mutual information for continuous random variables -- 2.4 Relative entropy and mutual information -- 2.5 Data processing inequality -- 2.6 Notes and additional references --
3. Lossless source coding -- 3.1 The lossless source coding problem -- 3.2 Definitions, properties, and the source coding theorem -- 3.3 Huffman coding and code trees -- 3.4 Elias coding and arithmetic coding -- 3.5 Lempel-Ziv coding -- 3.6 Kraft inequality -- 3.7 The AEP and data compression -- 3.8 Notes and additional references --
4. Channel capacity -- 4.1 The definition of channel capacity -- 4.2 Properties of channel capacity -- 4.3 Calculating capacity for discrete memoryless channels -- 4.4 The channel coding theorem -- 4.5 Decoding and jointly typical sequences -- 4.6 Fano's inequality and the converse to the coding theorem -- 4.7 The additive Gaussian noise channel and capacity -- 4.8 Converse to the coding theorem for Gaussian channels -- 4.9 Expressions for capacity and the Gaussian channel -- 4.9.1 Parallel Gaussian channels [4, 5] -- 4.9.2 Channels with colored Gaussian noise [4, 5] -- 4.10 Band-limited channels -- 4.11 Notes and additional references --
5. Rate distortion theory and lossy source coding -- 5.1 The rate distortion function for discrete memoryless sources -- 5.2 The rate distortion function for continuous amplitude sources -- 5.3 The Shannon lower bound and the optimum backward channel -- 5.3.1 Binary symmetric source -- 5.3.2 Gaussian source -- 5.4 Stationary Gaussian sources with memory -- 5.5 The rate distortion function for a Gaussian autoregressive source -- 5.6 Composite source models and conditional rate distortion functions -- 5.7 The rate distortion theorem for independent Gaussian sources-revisited -- 5.8 Applications of R(D) to scalar quantization -- 5.9 Notes and additional references --
A. Useful inequalities -- B. Laws of large numbers -- B.1 Inequalities and laws of large numbers -- B.1.1 Markov's inequality -- B.1.2 Chebyshev's inequality -- B.1.3 Weak law of large numbers -- B.1.4 Strong law of large numbers -- C. Kuhn-Tucker conditions -- Bibliography -- Author's biography.
Illustrative sketches of selected topics from these contents appear below.
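As an illustration of the entropy and mutual information definitions listed under Chapter 2, here is a minimal Python sketch; the joint pmf `p_xy` is an invented example, not taken from the book:

```python
# Entropy and mutual information for a small discrete joint pmf,
# matching the definitions in Chapter 2. The joint pmf p_xy is an
# invented example, not taken from the book.
import math

p_xy = [[0.4, 0.1],   # p(x, y): rows index x, columns index y
        [0.1, 0.4]]

p_x = [sum(row) for row in p_xy]        # marginal p(x)
p_y = [sum(col) for col in zip(*p_xy)]  # marginal p(y)

def entropy(p):
    """H(p) = -sum_i p_i log2 p_i in bits, with 0 log 0 taken as 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

h_x, h_y = entropy(p_x), entropy(p_y)
h_xy = entropy([p for row in p_xy for p in row])  # joint entropy H(X, Y)
i_xy = h_x + h_y - h_xy                           # I(X; Y) = H(X) + H(Y) - H(X, Y)

print(f"H(X) = {h_x:.4f}, H(Y) = {h_y:.4f}, "
      f"H(X,Y) = {h_xy:.4f}, I(X;Y) = {i_xy:.4f} bits")
```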
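For the Huffman coding and code trees of Section 3.3, a minimal sketch of Huffman's algorithm over a toy alphabet; the source pmf is an invented example, chosen dyadic so the average codeword length meets the entropy exactly:

```python
# Huffman code construction over a toy alphabet, illustrating the
# code-tree view in Section 3.3. With dyadic probabilities the average
# codeword length equals the source entropy.
import heapq
from itertools import count

def huffman_code(pmf):
    """Return a prefix-free code {symbol: bitstring} via Huffman's algorithm."""
    ticket = count()  # tie-breaker so the heap never compares the code dicts
    heap = [(p, next(ticket), {sym: ""}) for sym, p in pmf.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, code0 = heapq.heappop(heap)  # two least-probable subtrees
        p1, _, code1 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in code0.items()}   # left branch: prepend 0
        merged.update({s: "1" + c for s, c in code1.items()})  # right branch: 1
        heapq.heappush(heap, (p0 + p1, next(ticket), merged))
    return heap[0][2]

pmf = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(pmf)
avg_len = sum(p * len(code[s]) for s, p in pmf.items())
print(code, f"average length = {avg_len} bits/symbol")  # equals H = 1.75 bits
```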
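For the parallel Gaussian channels of Section 4.9.1, a sketch of the water-filling power allocation that achieves capacity; the noise levels and power budget are arbitrary examples, and the water level is located by bisection:

```python
# Water-filling power allocation for parallel Gaussian channels, as in
# Section 4.9.1. Noise levels and the power budget are invented examples;
# the water level mu is located by bisection.
import math

def water_filling(noise, total_power, iters=60):
    """Allocate P_i = max(mu - N_i, 0) so that sum_i P_i = total_power."""
    lo, hi = min(noise), max(noise) + total_power  # bracket for the water level
    for _ in range(iters):
        mu = (lo + hi) / 2
        used = sum(max(mu - n, 0.0) for n in noise)
        lo, hi = (mu, hi) if used < total_power else (lo, mu)
    powers = [max(mu - n, 0.0) for n in noise]
    # Capacity in bits per channel use: sum_i (1/2) log2(1 + P_i / N_i).
    cap = sum(0.5 * math.log2(1 + p / n) for p, n in zip(powers, noise))
    return powers, cap

powers, cap = water_filling(noise=[1.0, 2.0, 4.0], total_power=5.0)
print([round(p, 3) for p in powers], f"C = {cap:.3f} bits/channel use")
```

With these values the water level works out to 4, so the allocation is (3, 2, 0) and C = 1.5 bits per channel use; the noisiest subchannel receives no power at all.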
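Finally, for the two classic cases of Sections 5.3.1 and 5.3.2, the closed-form rate distortion functions of a binary symmetric source under Hamming distortion and a Gaussian source under mean-squared error; the parameter values in the demo lines are arbitrary:

```python
# Closed-form rate distortion functions for the two classic cases in
# Sections 5.3.1 and 5.3.2: a binary symmetric source under Hamming
# distortion and a Gaussian source under mean-squared error.
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rd_binary(p, d):
    """R(D) = H(p) - H(D) for a Bernoulli(p) source, 0 <= D <= min(p, 1-p)."""
    return max(h2(p) - h2(d), 0.0)

def rd_gaussian(var, d):
    """R(D) = (1/2) log2(var / D) for an N(0, var) source, 0 < D <= var."""
    return max(0.5 * math.log2(var / d), 0.0)

print(f"Binary, p = 0.5, D = 0.1:   R = {rd_binary(0.5, 0.1):.3f} bits/symbol")
print(f"Gaussian, var = 1, D = 0.25: R = {rd_gaussian(1.0, 0.25):.3f} bits/sample")
```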
Abstract freely available; full-text restricted to subscribers or individual document purchasers.
Indexed by: Compendex; INSPEC; Google Scholar; Google Book Search.
This book is targeted specifically at problems in communications and compression: it provides the fundamental principles and results of information theory and rate distortion theory for these applications, and it presents methods that have proved, and will continue to prove, useful in analyzing and designing real systems. The chapters treat entropy, mutual information, lossless source coding, channel capacity, and rate distortion theory; however, it is the selection, ordering, and presentation of the topics within these broad categories that are unique to this concise book.
Also available in print.
Title from PDF title page (viewed on January 14, 2014).