Elements of Information Theory
John Wiley & Sons, July 18, 2006 - 784 pages

The latest edition of this classic is updated with new problem sets and material.

The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers, and historical notes that follow each chapter recap the main points.

The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references

Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
Contents
1 Introduction and Preview | 1
2 Entropy, Relative Entropy, and Mutual Information | 13
3 Asymptotic Equipartition Property | 57
4 Entropy Rates of a Stochastic Process | 71
5 Data Compression | 103
6 Gambling and Data Compression | 159
7 Channel Capacity | 183
8 Differential Entropy | 243
9 Gaussian Channel | 261
10 Rate Distortion Theory | 301
11 Information Theory and Statistics | 347
12 Maximum Entropy | 409
13 Universal Source Coding | 427
14 Kolmogorov Complexity | 463
15 Network Information Theory | 509
    Multiple-Access Channel | 532
    Multiple-Access Channels | 558
    Relay Channel | 571
    Rate Distortion with Side Information | 580
16 Information Theory and Portfolio Theory | 613
17 Inequalities in Information Theory | 657
Bibliography | 689
List of Symbols | 723
Common terms and phrases
achievable algorithm alphabet assume average binary symmetric channel bits broadcast channel calculate capacity region channel capacity codebook codeword codeword lengths conditional Consider convex corresponding data compression defined Definition denote density describe differential entropy doubling rate drawn i.i.d. encoding entropy rate equal estimate example expected feedback Find finite Fisher information follows Gaussian channel given Hence Huffman code independent information theory input integer joint distribution jointly typical Kolmogorov complexity large numbers Lemma Let X₁ log p(x lower bound Markov chain matrix maximizing maximum entropy minimization multiple-access channel mutual information node noise output P₁ parsing power constraint probability distribution probability mass function probability of error problem proof prove Pu(x random variable rate distortion function relative entropy sample satisfying sender source coding stationary stochastic process string symbol theorem typical sequences typical set vector X₁