In mathematical statistics and information theory, the Fisher information (sometimes simply called information) can be defined as the variance of the score, or equivalently as the expected value of the observed information. The role of the Fisher information in the asymptotic theory of maximum-likelihood estimation was emphasized by the statistician R. A. Fisher (following some initial results by F. Y. Edgeworth). In Bayesian statistics, the asymptotic distribution of the posterior mode depends on the Fisher information and not on the prior, according to the Bernstein–von Mises theorem, which was anticipated by Laplace for exponential families. The Fisher information is also used to compute the Jeffreys prior, a standard non-informative prior in Bayesian statistics.
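In common notation (the density f(x; θ) and the symbols below are illustrative conventions, not fixed by the text above), these two definitions read

\[
\mathcal{I}(\theta) \;=\; \operatorname{E}\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{\!2}\right] \;=\; -\,\operatorname{E}\!\left[\frac{\partial^{2}}{\partial\theta^{2}}\log f(X;\theta)\right],
\]

where the first expression is the variance of the score (the score has mean zero under standard regularity conditions, so its variance equals its second moment) and the second is the expected observed information; the equality of the two likewise holds only under those regularity conditions.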
The Fisher information matrix is used to calculate the covariance matrices associated with maximum-likelihood estimates. It also appears in the formulation of test statistics such as the Wald test.
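As a minimal sketch of the covariance claim (the Bernoulli model, sample sizes, and variable names below are chosen for illustration and are not from the text): for n i.i.d. observations, standard asymptotic theory gives

\[
\operatorname{Cov}\bigl(\hat\theta_{\mathrm{ML}}\bigr) \;\approx\; \bigl(n\,\mathcal{I}(\theta)\bigr)^{-1},
\]

which can be checked numerically:

    import numpy as np

    rng = np.random.default_rng(0)

    # Bernoulli(p) model: the Fisher information per observation is
    # I(p) = 1 / (p * (1 - p)), so the asymptotic variance of the MLE
    # p_hat (the sample mean of n observations) is 1 / (n * I(p)).
    p, n, reps = 0.3, 1_000, 50_000

    fisher_info = 1.0 / (p * (1.0 - p))
    asymptotic_var = 1.0 / (n * fisher_info)

    # Simulate the MLE many times; a sum of n Bernoulli(p) draws is
    # Binomial(n, p), so each simulated p_hat can be drawn directly.
    p_hat = rng.binomial(n, p, size=reps) / n
    empirical_var = p_hat.var()

    print(f"1 / (n * I(p))       = {asymptotic_var:.3e}")  # ~2.1e-04
    print(f"empirical Var(p_hat) = {empirical_var:.3e}")   # close to the above

The empirical variance of the simulated estimates agrees closely with the inverse-information prediction, which is the sense in which the Fisher information matrix yields the covariance of maximum-likelihood estimates.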