In information theory, the conditional entropy (or equivocation) quantifies the amount of information needed to describe the outcome of a random variable given that the value of another random variable is known. Here, information is measured in bits, nats, or bans. The entropy of Y conditioned on X is written as H(Y | X).
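As a concrete sketch of the definition, the snippet below computes H(Y | X) in bits from a small joint distribution p(x, y), using the identity H(Y | X) = -Σ p(x, y) log2(p(x, y) / p(x)). The particular joint distribution is a made-up example for illustration, not taken from the text.

```python
from math import log2

# Hypothetical joint distribution p(x, y) over two binary variables
# (an illustrative assumption, not from the article).
joint = {
    (0, 0): 0.5,
    (0, 1): 0.25,
    (1, 0): 0.125,
    (1, 1): 0.125,
}

def conditional_entropy(joint):
    """H(Y|X) = -sum over (x, y) of p(x, y) * log2(p(x, y) / p(x)), in bits."""
    # Marginal distribution p(x), obtained by summing out y.
    px = {}
    for (x, _y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    # Terms with p(x, y) = 0 contribute nothing, so they are skipped.
    return -sum(p * log2(p / px[x]) for (x, _y), p in joint.items() if p > 0)

print(conditional_entropy(joint))  # about 0.9387 bits
```

Equivalently, this is the p(x)-weighted average of the entropies of the conditional distributions p(y | x), which is how the "information still needed about Y once X is known" reading arises.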
Famous quotes containing the words conditional and/or entropy:
“Computer mediation seems to bathe action in a more conditional light: perhaps it happened; perhaps it didn’t. Without the layered richness of direct sensory engagement, the symbolic medium seems thin, flat, and fragile.”
—Shoshana Zuboff (b. 1951)
“Just as the constant increase of entropy is the basic law of the universe, so it is the basic law of life to be ever more highly structured and to struggle against entropy.”
—Václav Havel (b. 1936)