In general, you should find a complete sufficient statistic using the properties of exponential families.<br>
Then rescale it with appropriate factors to make it unbiased; the result is the UMVUE.<br>
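For example, if <math>X_1, \ldots, X_n</math> are iid <math>\mathrm{Poisson}(\lambda)</math>, the family is a full-rank exponential family, so <math>T=\sum_{i=1}^n X_i</math> is complete and sufficient.<br>
Since <math>E[T/n]=\lambda</math>, the statistic <math>\bar{X}=T/n</math> is an unbiased function of a complete sufficient statistic and is therefore the UMVUE of <math>\lambda</math> by the Lehmann–Scheffé theorem.<br>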
===Efficiency===
====Fisher Information====
{{main | Wikipedia: Fisher Information}}
* <math>I(\theta) = E\left[ \left( \frac{\partial}{\partial \theta} \log f(X; \theta) \right)^2 \,\middle|\, \theta \right]</math>
* or, if <math>\log f(x; \theta)</math> is twice differentiable, <math>I(\theta) = -E\left[ \frac{\partial^2}{\partial \theta^2} \log f(X; \theta) \,\middle|\, \theta \right]</math>
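As a quick sanity check, the first definition can be evaluated directly for a simple model. The sketch below (the Bernoulli(p) model is my illustration, not from these notes) computes the expectation of the squared score by summing over the two outcomes; the closed form is <math>I(p) = \frac{1}{p(1-p)}</math>.

```python
# Minimal sketch (Bernoulli(p) model assumed for illustration):
# I(p) = E[(d/dp log f(X; p))^2], computed by summing over x in {0, 1}.

def score(x, p):
    # d/dp log f(x; p), where f(x; p) = p**x * (1 - p)**(1 - x)
    return x / p - (1 - x) / (1 - p)

def fisher_information(p):
    # Expectation of the squared score under f(.; p)
    return sum(f * score(x, p) ** 2 for x, f in [(0, 1 - p), (1, p)])

p = 0.3
print(fisher_information(p))    # matches the closed form 1 / (p * (1 - p))
print(1 / (p * (1 - p)))
```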
====Cramér-Rao Lower Bound====
{{main | Wikipedia: Cramer-Rao Bound}}
Given an estimator <math>T(X)</math>, let <math>\psi(\theta)=E[T(X)]</math>.
Then <math>\operatorname{Var}(T) \geq \frac{(\psi'(\theta))^2}{I(\theta)}</math>
;Notes
* If <math>T(X)</math> is unbiased, then <math>\psi(\theta)=\theta \implies \psi'(\theta) = 1</math>,
: so the lower bound becomes <math>\frac{1}{I(\theta)}</math>.
The efficiency of an unbiased estimator is defined as <math>e(T) = \frac{I(\theta)^{-1}}{\operatorname{Var}(T)}</math>; an unbiased estimator with <math>e(T)=1</math> is called efficient.
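The bound above can be checked numerically. In this sketch (the Bernoulli setup and the helper `simulate_var` are my assumptions, not from these notes), the sample mean of <math>n</math> Bernoulli(<math>p</math>) draws is unbiased for <math>p</math>, the Fisher information of the sample is <math>n/(p(1-p))</math>, and the sample mean's variance <math>p(1-p)/n</math> attains the bound exactly, i.e. <math>e(T)=1</math>.

```python
import random

# Minimal sketch: Monte Carlo check that the sample mean of n Bernoulli(p)
# draws attains the Cramer-Rao bound p*(1-p)/n for unbiased estimators.

def simulate_var(n, p, trials=50_000, seed=0):
    """Monte Carlo estimate of Var(sample mean) over many repetitions."""
    rng = random.Random(seed)
    means = [sum(rng.random() < p for _ in range(n)) / n for _ in range(trials)]
    mu = sum(means) / trials
    return sum((m - mu) ** 2 for m in means) / trials

n, p = 10, 0.3
crlb = p * (1 - p) / n            # 1 / (n * I(p)) for unbiased estimators
print(simulate_var(n, p), crlb)   # the two agree up to Monte Carlo noise
```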
===Sufficient Statistics===
====Ancillary Statistics====
==Tests==