Information theory basics
Shannon information content
Consider a discrete random variable X with probability mass function p(x)
Shannon information content of the outcome X = x: h(x) = log2(1/p(x)) = -log2 p(x), measured in bits
Desiderata in measuring information
Deterministic outcomes contain no information
Information content increases with decreasing probability
Information content is additive for independent random variables
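A quick check of the additivity desideratum (not part of the original diagram): for independent X and Y, p(x, y) = p(x) p(y), so
h(x, y) = -log2 p(x) p(y) = -log2 p(x) - log2 p(y) = h(x) + h(y)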
Shannon entropy
Average information content: H(X) = E[h(X)] = -Σ_x p(x) log2 p(x)
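A minimal numerical sketch of this average (not from the diagram; assumes NumPy, and the function name entropy is just illustrative):

```python
import numpy as np

def entropy(p, base=2):
    """Shannon entropy of a discrete PMF given as an array of probabilities."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 * log 0 = 0
    return -np.sum(p * np.log(p)) / np.log(base)

print(entropy([0.5, 0.5]))            # 1.0 bit  (fair coin)
print(entropy([1.0, 0.0]))            # 0.0 bits (deterministic outcome: no information)
```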
Joint entropy
Multivariate generalization of Shannon entropy: H(X, Y) = -Σ_{x,y} p(x, y) log2 p(x, y)
Conditional entropy
H(Y|X) = -Σ_{x,y} p(x, y) log2 p(y|x) = H(X, Y) - H(X)
Properties
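The Properties node is empty here; two standard properties of conditional entropy, listed as an assumption about what it covered: the chain rule H(X, Y) = H(X) + H(Y | X), and the fact that conditioning never increases entropy, H(Y | X) ≤ H(Y), with equality iff X and Y are independent.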
Relative entropy
Kullback-Leibler (KL) divergence
A useful measure of difference between two distributions
Take P and Q to be the PMFs of two distributions over the same outcomes: D_KL(P || Q) = Σ_x P(x) log2 (P(x) / Q(x))
Properties of KL Divergence
Not a metric: D_KL(P || Q) ≥ 0 with equality iff P = Q (Gibbs' inequality), but it is not symmetric and does not satisfy the triangle inequality
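A small sketch that makes the asymmetry concrete (assumes NumPy; the function name kl_divergence and the example PMFs are illustrative):

```python
import numpy as np

def kl_divergence(p, q, base=2):
    """D_KL(P || Q) for discrete PMFs p and q over the same outcomes."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0                      # terms with p(x) = 0 contribute nothing
    return np.sum(p[mask] * np.log(p[mask] / q[mask])) / np.log(base)

p, q = [0.5, 0.5], [0.9, 0.1]
print(kl_divergence(p, q))            # ~0.74 bits
print(kl_divergence(q, p))            # ~0.53 bits -> not symmetric
```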
Mutual information
General measure of dependence: I(X; Y) = D_KL(p(x, y) || p(x) p(y)) = H(X) - H(X | Y), which is zero iff X and Y are independent
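A sketch computing I(X; Y) from a joint PMF table as the KL divergence between the joint and the product of its marginals (assumes NumPy; the joint tables are illustrative):

```python
import numpy as np

def mutual_information(joint, base=2):
    """I(X; Y) from a 2-D array with joint[i, j] = p(x_i, y_j)."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1, keepdims=True)     # marginal p(x), shape (nx, 1)
    py = joint.sum(axis=0, keepdims=True)     # marginal p(y), shape (1, ny)
    mask = joint > 0
    ratio = joint[mask] / (px @ py)[mask]     # p(x, y) / (p(x) p(y))
    return np.sum(joint[mask] * np.log(ratio)) / np.log(base)

print(mutual_information(np.outer([0.5, 0.5], [0.5, 0.5])))   # 0.0: independent
print(mutual_information([[0.5, 0.0],
                          [0.0, 0.5]]))                        # 1.0 bit: fully dependent
```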
Cross-Entropy
p = true distribution
q = predicted distribution
H(p, q) = -Σ_x p(x) log2 q(x) = H(p) + D_KL(p || q)
Cross-Entropy as a Cost Function
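Since H(p) does not depend on the model, minimizing H(p, q) over q is equivalent to minimizing D_KL(p || q). A minimal sketch of cross-entropy as a classification cost, where p is a one-hot label so the sum collapses to -log q(true class) (assumes NumPy; natural log as is conventional for loss functions; names and numbers are illustrative):

```python
import numpy as np

def cross_entropy_loss(q_pred, true_class, eps=1e-12):
    """-log q(true class): cross-entropy between a one-hot label p and prediction q."""
    q_pred = np.asarray(q_pred, dtype=float)
    return -np.log(q_pred[true_class] + eps)   # eps guards against log(0)

q = [0.7, 0.2, 0.1]                   # model's predicted class probabilities
print(cross_entropy_loss(q, 0))       # ~0.36: confident and correct -> low cost
print(cross_entropy_loss(q, 2))       # ~2.30: confident and wrong   -> high cost
```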