Definition
If H(Y|X = x) is the entropy of the variable Y conditioned on the variable X taking a certain value x, then H(Y|X) is the result of averaging H(Y|X = x) over all possible values x that X may take.
Given a discrete random variable X with support 𝒳 and a discrete random variable Y with support 𝒴, the conditional entropy of Y given X is defined as:

H(Y|X) = Σ_{x∈𝒳} p(x) H(Y|X = x) = −Σ_{x∈𝒳} Σ_{y∈𝒴} p(x, y) log( p(x, y) / p(x) )
Note: The supports of X and Y can be replaced by their domains if it is understood that 0 log 0 should be treated as being equal to zero.
H(Y|X) = 0 if and only if the value of Y is completely determined by the value of X. Conversely, H(Y|X) = H(Y) if and only if Y and X are independent random variables.
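The definition above can be sketched as a short computation. The helper below is illustrative (the function name and the dict-based joint-distribution representation are assumptions, not from the original text): it computes H(Y|X) in bits from a joint probability mass function, applying the 0 log 0 := 0 convention, and checks the two extreme cases just described.

```python
import math

def conditional_entropy(joint):
    """Conditional entropy H(Y|X) in bits.

    joint: dict mapping (x, y) -> p(x, y).
    Hypothetical helper for illustration only.
    """
    # Marginal p(x) = sum over y of p(x, y)
    px = {}
    for (x, _y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    # H(Y|X) = -sum_{x,y} p(x, y) * log2( p(x, y) / p(x) ),
    # skipping zero-probability terms (0 log 0 := 0)
    h = 0.0
    for (x, _y), p in joint.items():
        if p > 0:
            h -= p * math.log2(p / px[x])
    return h

# Y completely determined by X: H(Y|X) = 0
deterministic = {(0, 0): 0.5, (1, 1): 0.5}
print(conditional_entropy(deterministic))  # 0.0

# X and Y independent fair bits: H(Y|X) = H(Y) = 1 bit
independent = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
print(conditional_entropy(independent))  # 1.0
```

The deterministic case gives 0 because every conditional distribution p(y|x) is concentrated on a single value, while the independent case gives exactly H(Y), matching the two boundary properties stated above.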