Elements of Information Theory
The latest edition of this classic is updated with new problem sets and material
The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory.
All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points.
The Second Edition features:
* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references
Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
Contents
Asymptotic Equipartition Property
Entropy Rates of a Stochastic Process
Gambling and Data Compression
Information Theory and Statistics
Universal Source Coding
Network Information Theory
Information Theory and Portfolio Theory
Inequalities in Information Theory
List of Symbols
Rate Distortion Theory