Chain Rules for Entropy, Relative Entropy and Mutual Information
We now show that the entropy of a collection of random variables is the sum of the conditional entropies.
Note: The proofs here are still rough; they will be refined later.
Ref: Elements of Information Theory (Cover & Thomas)
Chain Rule for Entropy
Let $X_1, X_2, \ldots, X_n$ be drawn according to $p(x_1, x_2, \ldots, x_n)$. Then

$$H(X_1, X_2, \ldots, X_n) = \sum_{i=1}^{n} H(X_i \mid X_{i-1}, \ldots, X_1).$$
Proof: Short
By repeated application of the two-variable expansion rule $H(X, Y) = H(X) + H(Y \mid X)$, we have

$$
\begin{aligned}
H(X_1, X_2) &= H(X_1) + H(X_2 \mid X_1), \\
H(X_1, X_2, X_3) &= H(X_1) + H(X_2, X_3 \mid X_1) \\
&= H(X_1) + H(X_2 \mid X_1) + H(X_3 \mid X_2, X_1), \\
&\;\;\vdots \\
H(X_1, X_2, \ldots, X_n) &= H(X_1) + H(X_2 \mid X_1) + \cdots + H(X_n \mid X_{n-1}, \ldots, X_1) \\
&= \sum_{i=1}^{n} H(X_i \mid X_{i-1}, \ldots, X_1). \qquad \Box
\end{aligned}
$$
Proof: Long
We write $p(x_1, \ldots, x_n) = \prod_{i=1}^{n} p(x_i \mid x_{i-1}, \ldots, x_1)$ and expand the joint entropy directly:

$$
\begin{aligned}
H(X_1, X_2, \ldots, X_n)
&= -\sum_{x_1, \ldots, x_n} p(x_1, \ldots, x_n) \log p(x_1, \ldots, x_n) \\
&= -\sum_{x_1, \ldots, x_n} p(x_1, \ldots, x_n) \log \prod_{i=1}^{n} p(x_i \mid x_{i-1}, \ldots, x_1) \\
&= -\sum_{x_1, \ldots, x_n} \sum_{i=1}^{n} p(x_1, \ldots, x_n) \log p(x_i \mid x_{i-1}, \ldots, x_1) \\
&= -\sum_{i=1}^{n} \sum_{x_1, \ldots, x_i} p(x_1, \ldots, x_i) \log p(x_i \mid x_{i-1}, \ldots, x_1) \\
&= \sum_{i=1}^{n} H(X_i \mid X_{i-1}, \ldots, X_1). \qquad \Box
\end{aligned}
$$
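As a numerical sanity check (not from the book), the chain rule can be verified on a small random joint distribution: the joint entropy must equal the sum of the conditional entropies, using $H(X_i \mid X_{i-1}, \ldots, X_1) = H(X_1, \ldots, X_i) - H(X_1, \ldots, X_{i-1})$. All names and data here are hypothetical.

```python
# Sanity check of the entropy chain rule on a random joint pmf:
# H(X1, X2, X3) = H(X1) + H(X2 | X1) + H(X3 | X2, X1).
import itertools
import math
import random

random.seed(0)

# A random joint pmf p(x1, x2, x3) over binary alphabets (hypothetical data).
outcomes = list(itertools.product([0, 1], repeat=3))
weights = [random.random() for _ in outcomes]
total = sum(weights)
p = {xs: w / total for xs, w in zip(outcomes, weights)}

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {outcome: prob}."""
    return -sum(q * math.log2(q) for q in pmf.values() if q > 0)

def marginal(pmf, idxs):
    """Marginal pmf over the coordinates in idxs."""
    out = {}
    for xs, q in pmf.items():
        key = tuple(xs[i] for i in idxs)
        out[key] = out.get(key, 0.0) + q
    return out

# Right-hand side: sum of conditional entropies, each written as a
# difference of joint entropies H(X1..Xi) - H(X1..X_{i-1}).
rhs = sum(
    entropy(marginal(p, range(i + 1))) - entropy(marginal(p, range(i)))
    for i in range(3)
)

lhs = entropy(p)  # left-hand side: joint entropy H(X1, X2, X3)
print(lhs, rhs)
assert math.isclose(lhs, rhs), "chain rule violated"
```

The telescoping structure of the proof shows up directly in the code: each conditional entropy is a difference of consecutive joint entropies, so the sum collapses to the full joint entropy.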
Chain Rule for Mutual Information
Mutual information also satisfies a chain rule:

$$I(X_1, X_2, \ldots, X_n; Y) = \sum_{i=1}^{n} I(X_i; Y \mid X_{i-1}, \ldots, X_1),$$

where $I(X; Y \mid Z) = H(X \mid Z) - H(X \mid Y, Z)$ is the conditional mutual information.
Proof

Using $I(A; B) = H(A) - H(A \mid B)$ together with the chain rule for entropy, applied once unconditionally and once conditioned on $Y$,

$$
\begin{aligned}
I(X_1, X_2, \ldots, X_n; Y)
&= H(X_1, X_2, \ldots, X_n) - H(X_1, X_2, \ldots, X_n \mid Y) \\
&= \sum_{i=1}^{n} H(X_i \mid X_{i-1}, \ldots, X_1) - \sum_{i=1}^{n} H(X_i \mid X_{i-1}, \ldots, X_1, Y) \\
&= \sum_{i=1}^{n} I(X_i; Y \mid X_{i-1}, \ldots, X_1). \qquad \Box
\end{aligned}
$$
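The $n = 2$ case, $I(X_1, X_2; Y) = I(X_1; Y) + I(X_2; Y \mid X_1)$, can likewise be checked numerically by expressing every term through joint entropies. This is a sketch with hypothetical data, not material from the book.

```python
# Check of the mutual-information chain rule for n = 2:
# I(X1, X2; Y) = I(X1; Y) + I(X2; Y | X1).
import itertools
import math
import random

random.seed(1)

# Random joint pmf p(x1, x2, y) over binary alphabets (hypothetical data).
outcomes = list(itertools.product([0, 1], repeat=3))
weights = [random.random() for _ in outcomes]
total = sum(weights)
p = {xs: w / total for xs, w in zip(outcomes, weights)}

def entropy(pmf):
    return -sum(q * math.log2(q) for q in pmf.values() if q > 0)

def H(idxs):
    """Joint entropy of the coordinates in idxs (0 = X1, 1 = X2, 2 = Y)."""
    out = {}
    for xs, q in p.items():
        key = tuple(xs[i] for i in idxs)
        out[key] = out.get(key, 0.0) + q
    return entropy(out)

# I(X1, X2; Y) = H(X1, X2) + H(Y) - H(X1, X2, Y)
lhs = H([0, 1]) + H([2]) - H([0, 1, 2])

# I(X1; Y) = H(X1) + H(Y) - H(X1, Y)
i1 = H([0]) + H([2]) - H([0, 2])
# I(X2; Y | X1) = H(X2 | X1) - H(X2 | X1, Y)
#              = H(X1, X2) - H(X1) - H(X1, X2, Y) + H(X1, Y)
i2 = H([0, 1]) - H([0]) - H([0, 1, 2]) + H([0, 2])

print(lhs, i1 + i2)
assert math.isclose(lhs, i1 + i2), "chain rule violated"
```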
Chain Rule for Relative Entropy
The chain rule for relative entropy is used in Section 4.4 to prove a version of the second law of thermodynamics. Define the conditional relative entropy as

$$D\bigl(p(y \mid x) \,\|\, q(y \mid x)\bigr) = \sum_{x} p(x) \sum_{y} p(y \mid x) \log \frac{p(y \mid x)}{q(y \mid x)}.$$

Then the relative entropy between two joint distributions splits as

$$D\bigl(p(x, y) \,\|\, q(x, y)\bigr) = D\bigl(p(x) \,\|\, q(x)\bigr) + D\bigl(p(y \mid x) \,\|\, q(y \mid x)\bigr).$$

Proof

$$
\begin{aligned}
D\bigl(p(x, y) \,\|\, q(x, y)\bigr)
&= \sum_{x} \sum_{y} p(x, y) \log \frac{p(x, y)}{q(x, y)} \\
&= \sum_{x} \sum_{y} p(x, y) \log \frac{p(x)\, p(y \mid x)}{q(x)\, q(y \mid x)} \\
&= \sum_{x} \sum_{y} p(x, y) \log \frac{p(x)}{q(x)} + \sum_{x} \sum_{y} p(x, y) \log \frac{p(y \mid x)}{q(y \mid x)} \\
&= D\bigl(p(x) \,\|\, q(x)\bigr) + D\bigl(p(y \mid x) \,\|\, q(y \mid x)\bigr). \qquad \Box
\end{aligned}
$$
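This decomposition can also be checked numerically. The sketch below (hypothetical names and data) draws two random joint pmfs $p(x, y)$ and $q(x, y)$ and confirms that the joint divergence equals the marginal divergence plus the conditional one.

```python
# Check of the relative-entropy chain rule:
# D(p(x,y) || q(x,y)) = D(p(x) || q(x)) + D(p(y|x) || q(y|x)).
import itertools
import math
import random

random.seed(2)

xs_ys = list(itertools.product([0, 1], [0, 1, 2]))

def random_pmf():
    """A random strictly positive joint pmf over the (x, y) grid."""
    w = [random.random() for _ in xs_ys]
    s = sum(w)
    return {xy: wi / s for xy, wi in zip(xs_ys, w)}

p, q = random_pmf(), random_pmf()

def kl(a, b):
    """D(a || b) in bits; assumes b > 0 wherever a > 0."""
    return sum(a[k] * math.log2(a[k] / b[k]) for k in a if a[k] > 0)

def marg_x(pmf):
    """Marginal pmf over x."""
    out = {}
    for (x, y), w in pmf.items():
        out[x] = out.get(x, 0.0) + w
    return out

px, qx = marg_x(p), marg_x(q)

# Conditional relative entropy D(p(y|x) || q(y|x)), averaged over p(x);
# p(y|x) = p(x, y) / p(x), and the p(x) weight folds into p(x, y).
d_cond = sum(
    p[(x, y)] * math.log2((p[(x, y)] / px[x]) / (q[(x, y)] / qx[x]))
    for (x, y) in xs_ys if p[(x, y)] > 0
)

lhs = kl(p, q)
print(lhs, kl(px, qx) + d_cond)
assert math.isclose(lhs, kl(px, qx) + d_cond), "chain rule violated"
```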