Information Theory
Measures
Entropy
Measures the average uncertainty in a random variable X
H(X) = -Σp(x)log(p(x)) = -E(log(p(X)))
H(X) depends only on the distribution of X, not on the actual values taken by X; hence it is also written H(pX)
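A minimal sketch of the entropy formula above in Python, assuming the distribution is given as a plain list of probabilities; the function name entropy and the base-2 logarithm are illustrative choices, not part of the diagram:

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum of p(x) * log2(p(x)) over a discrete distribution.

    p is a sequence of probabilities summing to 1; outcomes with p(x) = 0
    contribute nothing, by the convention 0*log(0) = 0.
    """
    return -sum(px * math.log2(px) for px in p if px > 0)

print(entropy([0.5, 0.5]))  # 1.0 bit: a fair coin is maximally uncertain
print(entropy([0.9, 0.1]))  # ~0.469 bits: a biased coin is more predictable
```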
Joint Entropy
H(X, Y) = -Σp(x,y)log(p(x,y)) = -E(log(p(X, Y)))
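A similar sketch for the joint entropy, assuming the joint distribution is stored as a dict mapping (x, y) pairs to probabilities (a hypothetical layout chosen for illustration):

```python
import math

def joint_entropy(pxy):
    """H(X, Y) = -sum over (x, y) of p(x,y) * log2(p(x,y))."""
    return -sum(p * math.log2(p) for p in pxy.values() if p > 0)

# Two independent fair coin flips: every pair is equally likely.
pxy = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(joint_entropy(pxy))  # 2.0 bits, i.e. H(X) + H(Y) when X and Y are independent
```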
Mutual Information
I(X; Y) = H(X) + H(Y) - H(X, Y)
I(X; Y) = Σp(x,y)log(p(x,y)/(p(x)p(y))) = E(log(p(X,Y)/(p(X)p(Y))))
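A sketch that computes the sum formula directly and checks it against the identity I(X; Y) = H(X) + H(Y) - H(X, Y) above; the noisy-copy joint distribution is a made-up example:

```python
import math
from collections import defaultdict

def mutual_information(pxy):
    """I(X; Y) = sum over (x, y) of p(x,y) * log2(p(x,y) / (p(x)p(y)))."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in pxy.items():  # marginalise the joint both ways
        px[x] += p
        py[y] += p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

def h(ps):
    """Entropy of a bare list of probabilities, as in the Entropy branch."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

# X is a fair bit and Y is a noisy copy of X (they agree 90% of the time).
pxy = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
print(mutual_information(pxy))                           # ~0.531 bits
print(h([0.5, 0.5]) + h([0.5, 0.5]) - h(pxy.values()))   # same value via the identity
```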
Conditional Entropy
H(Y|X) = -ΣΣp(x,y)log(p(y|x)) = -E(log(p(Y|X)))
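A sketch of the conditional-entropy formula, using p(y|x) = p(x,y)/p(x) to work from the same assumed joint-dict layout as above:

```python
import math
from collections import defaultdict

def conditional_entropy(pxy):
    """H(Y|X) = -sum over (x, y) of p(x,y) * log2(p(y|x)), with p(y|x) = p(x,y)/p(x)."""
    px = defaultdict(float)
    for (x, _), p in pxy.items():
        px[x] += p
    return -sum(p * math.log2(p / px[x]) for (x, _), p in pxy.items() if p > 0)

# Same noisy-copy joint as above: Y agrees with X 90% of the time.
pxy = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
print(conditional_entropy(pxy))  # ~0.469 bits: knowing X leaves only the 0.9/0.1 noise
```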
Probability
Marginal probability is the probability of an event irrespective of the outcome of another variable: p(x) = Σp(x,y), summing over y.
Conditional probability is the probability of one event given that a second event has occurred: p(y|x) = p(x,y)/p(x).
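A small sketch of both notions, assuming a toy joint table: the marginal sums out the other variable, and the conditional renormalizes the joint by the marginal:

```python
from collections import defaultdict

# Hypothetical joint distribution p(weather, activity), for illustration only.
pxy = {('sun', 'walk'): 0.4, ('sun', 'read'): 0.2,
       ('rain', 'walk'): 0.1, ('rain', 'read'): 0.3}

# Marginal probability: p(x) = sum over y of p(x, y).
px = defaultdict(float)
for (x, _), p in pxy.items():
    px[x] += p
print(dict(px))  # {'sun': 0.6, 'rain': 0.4}

# Conditional probability: p(y | x) = p(x, y) / p(x).
print(pxy[('rain', 'walk')] / px['rain'])  # 0.25
```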