In [[statistics]], the '''Fisher information''' ''I''(θ), thought of as the amount of [[information]] that an observable [[random variable]] ''X'' carries about an unobservable parameter θ upon which the [[probability distribution]] of ''X'' depends, is the [[variance]] of the [[score (statistics)|score]]. Because the [[expectation]] of the score is zero, this may be written as

:<math>\mathcal{I}(\theta) = \operatorname{E}\left[\left(\frac{\partial}{\partial\theta} \ln f(X;\theta)\right)^2\right],</math>

where ''f''(''X''; θ) denotes the [[likelihood function]].
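The identity above (Fisher information as the variance of the score, whose expectation is zero) can be checked numerically. A minimal sketch, assuming ''X'' ~ Normal(θ, 1) as the illustrative model, where the score is ''X'' − θ and the closed-form information is ''I''(θ) = 1; the sample size and seed are arbitrary choices:

```python
import numpy as np

# Monte Carlo check that the Fisher information equals the variance of the
# score, for X ~ Normal(theta, 1) (illustrative model; closed form I = 1).
rng = np.random.default_rng(0)
theta = 2.0
x = rng.normal(theta, 1.0, size=200_000)

# Score: d/dtheta ln f(x; theta) = x - theta for the unit-variance normal.
score = x - theta

print(score.mean())  # close to 0: the expectation of the score is zero
print(score.var())   # close to 1: variance of the score = Fisher information
```

The two printed values illustrate both claims in the text: the score averages to zero, and its variance matches the known Fisher information for this model.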
The [[Cramér–Rao inequality]] states that the reciprocal of the Fisher information is a lower bound on the variance of any unbiased estimator of θ.
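As a concrete illustration of the bound, consider ''n'' Bernoulli(θ) trials (a model chosen here for the sketch, anticipating the example below): the unbiased estimator θ̂ = ''a''/''n'' has variance θ(1 − θ)/''n'', which is exactly 1/''I''(θ), so the Cramér–Rao bound is attained. A Monte Carlo check, with arbitrary sample sizes and seed:

```python
import numpy as np

# Cramér–Rao bound sketch for n Bernoulli(theta) trials (illustrative model):
# theta_hat = a/n is unbiased with variance theta(1-theta)/n, the reciprocal
# of the Fisher information n/(theta(1-theta)), so it attains the bound.
rng = np.random.default_rng(2)
n, theta = 40, 0.25

a = rng.binomial(n, theta, size=200_000)  # successes in each replication
theta_hat = a / n                         # unbiased estimator of theta

bound = theta * (1 - theta) / n           # 1 / I(theta) = 0.0046875 here
print(theta_hat.var())  # close to the bound
print(bound)
```

The empirical variance of θ̂ matches the reciprocal of the Fisher information, as the inequality predicts for this efficient estimator.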
The information contained in ''n'' independent [[Bernoulli trial]]s, each with probability of success θ, may be calculated as follows. In what follows, ''a'' represents the number of successes, ''b'' the number of failures, and ''n'' = ''a'' + ''b'' the total number of trials.
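The result of this calculation is the standard closed form ''I''(θ) = ''n''/(θ(1 − θ)). A sketch verifying it numerically, under the assumption that the log-likelihood is ''a'' ln θ + ''b'' ln(1 − θ) so the score is ''a''/θ − ''b''/(1 − θ); the replication count and seed are arbitrary:

```python
import numpy as np

# Monte Carlo sketch of the Fisher information in n Bernoulli trials.
# Closed form: I(theta) = n / (theta * (1 - theta)).
rng = np.random.default_rng(1)
n, theta = 50, 0.3

trials = rng.random((100_000, n)) < theta  # 100k replications of n trials
a = trials.sum(axis=1)                     # successes per replication
b = n - a                                  # failures per replication

# Score of the log-likelihood a*ln(theta) + b*ln(1-theta) w.r.t. theta.
score = a / theta - b / (1.0 - theta)

print(score.var())                 # Monte Carlo estimate of I(theta)
print(n / (theta * (1 - theta)))   # closed form, about 238.1 here
```

The sampled variance of the score agrees with ''n''/(θ(1 − θ)), and the per-trial information 1/(θ(1 − θ)) simply adds across the ''n'' independent trials.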