Joint Entropy - Definition

Definition

The joint entropy of two variables X and Y is defined as

H(X,Y) = -\sum_{x} \sum_{y} P(x,y) \log_2 [P(x,y)]

where x and y are particular values of X and Y, respectively, P(x,y) is the probability of these values occurring together, and P(x,y) \log_2 [P(x,y)] is defined to be 0 if P(x,y) = 0.
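
As an illustration, here is a minimal Python sketch of the two-variable formula above. The function name joint_entropy and the list-of-lists layout of the probability table are assumptions made for this example, not part of the original definition.

    import math

    def joint_entropy(p_xy):
        """Joint entropy H(X, Y) in bits of a 2-D joint probability table.

        p_xy[i][j] is P(X = x_i, Y = y_j) for particular values x_i and y_j.
        Zero-probability entries contribute nothing, matching the convention
        that P(x, y) * log2 P(x, y) is taken to be 0 when P(x, y) = 0.
        """
        return -sum(p * math.log2(p)
                    for row in p_xy
                    for p in row
                    if p > 0)

    # Two independent fair coin flips: each joint outcome has probability 1/4,
    # so the joint entropy is 2 bits.
    uniform = [[0.25, 0.25],
               [0.25, 0.25]]
    print(joint_entropy(uniform))  # 2.0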

For more than two variables X_1, \dots, X_n this expands to

H(X_1, \dots, X_n) = -\sum_{x_1} \cdots \sum_{x_n} P(x_1, \dots, x_n) \log_2 [P(x_1, \dots, x_n)]

where x_1, \dots, x_n are particular values of X_1, \dots, X_n, respectively, P(x_1, \dots, x_n) is the probability of these values occurring together, and P(x_1, \dots, x_n) \log_2 [P(x_1, \dots, x_n)] is defined to be 0 if P(x_1, \dots, x_n) = 0.
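
The same zero convention carries over to the n-variable case. Below is a hedged Python sketch that assumes the joint distribution is supplied as a dictionary keyed by value tuples; the name joint_entropy_n and that representation are illustrative choices, not prescribed by the text.

    import math

    def joint_entropy_n(joint):
        """Joint entropy H(X1, ..., Xn) in bits.

        `joint` maps each tuple of particular values (x1, ..., xn) to the
        probability P(x1, ..., xn) of those values occurring together.
        Zero-probability tuples may simply be omitted, since they contribute 0.
        """
        return -sum(p * math.log2(p) for p in joint.values() if p > 0)

    # Three independent fair bits: 8 equally likely outcomes, so H = 3 bits.
    joint = {(a, b, c): 1 / 8 for a in (0, 1) for b in (0, 1) for c in (0, 1)}
    print(joint_entropy_n(joint))  # 3.0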
