Information Theory - Quantities of Information

Information theory is based on probability theory and statistics. The most important quantities of information are entropy, which measures the information in a single random variable, and mutual information, which measures the information shared between two random variables. The former quantity sets a limit on how much message data can be compressed, while the latter can be used to find the achievable communication rate across a channel.
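
As a concrete illustration, here is a minimal Python sketch (not from the original article) that computes the entropy of a discrete distribution and the mutual information of two variables from a small, made-up joint probability table; the function names and the example distribution are illustrative assumptions. Zero-probability terms are skipped in the sums, reflecting the convention discussed further below.

    import math

    def entropy(probs, base=2):
        # Shannon entropy of a discrete distribution, in units set by `base`
        # (bits by default). Terms with p = 0 are skipped by convention.
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    def mutual_information(joint, base=2):
        # Mutual information I(X;Y) computed from a joint probability table
        # joint[x][y], using the marginal distributions of X and Y.
        px = [sum(row) for row in joint]
        py = [sum(col) for col in zip(*joint)]
        mi = 0.0
        for i, row in enumerate(joint):
            for j, pxy in enumerate(row):
                if pxy > 0:
                    mi += pxy * math.log(pxy / (px[i] * py[j]), base)
        return mi

    # Made-up joint distribution of two correlated binary variables X and Y.
    joint = [[0.4, 0.1],
             [0.1, 0.4]]

    print(entropy([0.5, 0.5]))        # 1.0 bit for a fair coin
    print(mutual_information(joint))  # about 0.278 bits shared between X and Y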

The choice of logarithmic base in the following formulae determines the unit of information entropy that is used. The most common unit of information is the bit, based on the binary logarithm. Other units include the nat, which is based on the natural logarithm, and the hartley, which is based on the common logarithm.
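
To make the role of the logarithmic base concrete, the following short sketch (again illustrative, not from the article) computes the entropy of a fair coin with each of the three bases, showing that the choice of base only rescales the result.

    import math

    p = [0.5, 0.5]  # a fair coin

    h_bits     = -sum(q * math.log2(q) for q in p)   # binary logarithm:  1.0 bit
    h_nats     = -sum(q * math.log(q) for q in p)    # natural logarithm: ln 2, about 0.693 nats
    h_hartleys = -sum(q * math.log10(q) for q in p)  # common logarithm:  log10 2, about 0.301 hartleys

    # All three values describe the same uncertainty; they differ only by
    # constant factors, e.g. 1 bit = ln 2 nats.
    print(h_bits, h_nats, h_hartleys)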

In what follows, an expression of the form p log p is considered by convention to be equal to zero whenever p = 0. This is justified because lim_{p → 0+} p log p = 0 for any logarithmic base.
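
A small sketch of this convention in code (the plogp helper is a hypothetical name introduced only for illustration): it returns zero at p = 0 and otherwise computes p log p, and the loop checks numerically that p log2 p shrinks toward zero as p approaches zero.

    import math

    def plogp(p, base=2):
        # Return p * log(p), using the convention that 0 * log(0) = 0.
        return 0.0 if p == 0 else p * math.log(p, base)

    # Numerical check that p * log2(p) tends to 0 as p -> 0+,
    # which is what justifies the convention.
    for p in (0.1, 0.01, 0.001, 0.0001):
        print(p, plogp(p))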
