Information Theory
Variable length code
Non-singular Codes
Uniquely Decodable Codes
Prefix Codes
Binary Prefix Code
Universal Lossless Variable-length Code
Block Codes (for Asymptotically Lossless Compression)
Block Codes for Discrete Memoryless Sources
Block Codes for Stationary Ergodic Sources
Redundancy for Lossless Block Data Compression
Single R.V.
Self-information & Entropy
Self-information
Represents the amount of information one gains upon learning that an event E has occurred (equivalently, the amount of uncertainty one had about E before learning that it happened)
I(p) is monotonically decreasing in p;
I(p) is a continuous function of p for 0 ≤ p ≤ 1;
I(p1 × p2) = I(p1) + I(p2) for independent events with probabilities p1 and p2 (additivity);
I(p) = −c · logb(p) for some constant c > 0 and base b > 1
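The axioms above pin down I(p) = −c · logb(p); a minimal sketch in Python checking them numerically (choosing b = 2 and c = 1, i.e. measuring in bits, is an assumption for illustration):

```python
import math

def self_information(p: float, base: float = 2.0) -> float:
    """Self-information I(p) = -log_b(p) of an event with probability p."""
    if not 0.0 < p <= 1.0:
        raise ValueError("p must be in (0, 1]")
    return -math.log(p, base)

# Monotonically decreasing in p: rarer events carry more information.
assert self_information(0.25) > self_information(0.5)

# Additivity for independent events: I(p1 * p2) = I(p1) + I(p2).
p1, p2 = 0.5, 0.25
assert math.isclose(self_information(p1 * p2),
                    self_information(p1) + self_information(p2))

# A certain event (p = 1) carries no information.
assert self_information(1.0) == 0.0
```

For example, an event with probability 1/8 yields self_information(0.125) = 3 bits, consistent with 1 bit (p = 1/2) plus 2 bits (p = 1/4) under independence.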
Entropy
Properties of Entropy
Joint Entropy
Conditional Entropy
Mutual Information
Properties
Conditional mutual information
Multiple R.V.
Properties of Entropy and Mutual Information for Multiple R.V.
Lossless Data Compression
Principles
Data Processing Inequality
Fano's Inequality
Divergence and Variational Distance
Convexity/Concavity of Information Measures
Fundamentals of Hypothesis Testing
Rényi's Information Measures