Chain Rule
Assume that the combined system determined by two random variables X and Y has joint entropy H(X,Y), that is, we need H(X,Y) bits of information on average to describe its exact state. Now if we first learn the value of X, we have gained H(X) bits of information. Once X is known, we only need H(X,Y) − H(X) bits to describe the state of the whole system. This quantity is exactly H(Y|X), which gives the chain rule of conditional entropy:

$$H(Y|X) = H(X,Y) - H(X).$$
Formally, the chain rule follows from the definition of conditional entropy together with the identity p(y|x) = p(x,y)/p(x):

$$
\begin{aligned}
H(Y|X) &= -\sum_{x,y} p(x,y)\,\log p(y|x) \\
       &= -\sum_{x,y} p(x,y)\,\bigl(\log p(x,y) - \log p(x)\bigr) \\
       &= -\sum_{x,y} p(x,y)\,\log p(x,y) + \sum_{x,y} p(x,y)\,\log p(x) \\
       &= H(X,Y) + \sum_{x} p(x)\,\log p(x) \\
       &= H(X,Y) - H(X).
\end{aligned}
$$
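As a quick numerical check, the short Python sketch below computes H(X,Y), H(X), and H(Y|X) directly from their definitions for a small joint distribution (the particular probabilities are made up for illustration) and confirms that H(Y|X) equals H(X,Y) − H(X).

```python
import numpy as np

# Hypothetical joint distribution p(x, y), chosen only for illustration:
# rows index values of X, columns index values of Y; entries sum to 1.
p_xy = np.array([[0.25, 0.25],
                 [0.40, 0.10]])

p_x = p_xy.sum(axis=1)  # marginal distribution of X

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 0]  # drop zero-probability outcomes (0 log 0 is taken as 0)
    return -(p * np.log2(p)).sum()

H_XY = entropy(p_xy.ravel())  # joint entropy H(X, Y)
H_X = entropy(p_x)            # marginal entropy H(X)

# Conditional entropy from its definition: H(Y|X) = -sum_{x,y} p(x,y) log2 p(y|x)
p_y_given_x = p_xy / p_x[:, None]
mask = p_xy > 0
H_Y_given_X = -(p_xy[mask] * np.log2(p_y_given_x[mask])).sum()

print(f"H(X,Y)        = {H_XY:.4f} bits")
print(f"H(X)          = {H_X:.4f} bits")
print(f"H(Y|X)        = {H_Y_given_X:.4f} bits")
print(f"H(X,Y) - H(X) = {H_XY - H_X:.4f} bits")  # matches H(Y|X) by the chain rule
```

For this distribution the script reports H(X,Y) ≈ 1.861 bits, H(X) = 1 bit, and H(Y|X) ≈ 0.861 bits, so the two sides of the chain rule agree, as expected.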