Entropic Uncertainty

In quantum mechanics, information theory, and Fourier analysis, the entropic uncertainty or Hirschman uncertainty is defined as the sum of the temporal and spectral Shannon entropies. It turns out that Heisenberg's uncertainty principle can be expressed as a lower bound on the sum of these entropies. This is stronger than the usual statement of the uncertainty principle in terms of the product of standard deviations.

In 1957, Hirschman considered a function f and its Fourier transform g such that

$$ g(y) \approx \int_{-\infty}^{\infty} \exp(-2\pi i x y)\, f(x)\, dx, \qquad f(x) \approx \int_{-\infty}^{\infty} \exp(2\pi i x y)\, g(y)\, dy, $$

where the "$\approx$" indicates convergence in $L^2$, and normalized so that (by Plancherel's theorem)

$$ \int_{-\infty}^{\infty} |f(x)|^2\, dx = \int_{-\infty}^{\infty} |g(y)|^2\, dy = 1. $$

He showed that for any such functions the sum of the Shannon entropies is non-negative:

$$ H\!\left(|f|^2\right) + H\!\left(|g|^2\right) \equiv -\int_{-\infty}^{\infty} |f(x)|^2 \log |f(x)|^2\, dx \;-\; \int_{-\infty}^{\infty} |g(y)|^2 \log |g(y)|^2\, dy \;\ge\; 0. $$
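
As a quick numerical illustration (a minimal sketch, assuming NumPy; the grid size and width are arbitrary choices), both entropies can be approximated by sampling f on a grid and computing g with the FFT. For the Gaussian above, the sum lands exactly on the tighter bound quoted next:

    import numpy as np

    # Approximate H(|f|^2) + H(|g|^2) on a grid, with g computed from f
    # by the DFT under the convention g(y) = integral of exp(-2 pi i x y) f(x) dx.
    N, L = 4096, 40.0                  # number of samples, width of the x-grid
    dx = L / N
    x = (np.arange(N) - N // 2) * dx   # symmetric grid centered at x = 0

    # Running example: the normalized Gaussian, its own Fourier transform.
    f = 2**0.25 * np.exp(-np.pi * x**2)

    # The ifftshift/fftshift pair centers x = 0 and y = 0; dx supplies the measure.
    g = dx * np.fft.fftshift(np.fft.fft(np.fft.ifftshift(f)))
    y = np.fft.fftshift(np.fft.fftfreq(N, d=dx))
    dy = y[1] - y[0]

    def shannon_entropy(p, step):
        """Differential Shannon entropy of a density p sampled on a uniform grid."""
        p = p[p > 1e-300]              # drop underflowed zeros so the log is finite
        return -np.sum(p * np.log(p)) * step

    total = shannon_entropy(np.abs(f)**2, dx) + shannon_entropy(np.abs(g)**2, dy)
    print(total)                       # ~0.3069 = log(e/2), the equality case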

A tighter bound,

$$ H\!\left(|f|^2\right) + H\!\left(|g|^2\right) \;\ge\; \log\frac{e}{2}, $$

was conjectured by Hirschman and Everett, and proven in 1975 by W. Beckner. Equality holds in the case of Gaussian distributions.
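
For illustration, both statements can be checked against the Gaussian example above: $|f|^2$ is a normal density with variance $\sigma^2 = 1/(4\pi)$, whose differential entropy is $\tfrac{1}{2}\log(2\pi e \sigma^2) = \tfrac{1}{2}\log(e/2)$, so the sum meets the bound exactly. The entropic bound also recovers the familiar variance form of the uncertainty principle, substantiating the claim in the opening paragraph: writing $\sigma_x, \sigma_y$ for the standard deviations of $|f|^2$ and $|g|^2$, and using the fact that the Gaussian maximizes differential entropy for fixed variance, $H(p) \le \tfrac{1}{2}\log(2\pi e \sigma^2)$,

$$ \log\frac{e}{2} \;\le\; H\!\left(|f|^2\right) + H\!\left(|g|^2\right) \;\le\; \log\!\left(2\pi e\, \sigma_x \sigma_y\right) \quad\Longrightarrow\quad \sigma_x \sigma_y \;\ge\; \frac{1}{4\pi}, $$

which is Heisenberg's inequality in these units; the entropic statement is thus the stronger of the two.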

Note, however, that the above entropic uncertainty function is distinctly different from the quantum von Neumann entropy represented in phase space.
