Statistics

Estimation

Maximum Likelihood Estimator

(MLE)

Uniformly Minimum Variance Unbiased Estimator (UMVUE)

UMVUE, sometimes called MVUE or UMVU.
See Wikipedia: Lehmann-Scheffe Theorem
An unbiased estimator that is a function of a complete sufficient statistic is a UMVUE.
In general, you first find a complete sufficient statistic using the properties of exponential families.
Then rescale or adjust it to make it unbiased; the result is the UMVUE.
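
As a quick illustration of this recipe (an example added here, not from the original notes): suppose \(X_1, \dots, X_n\) are i.i.d. \(\operatorname{Poisson}(\lambda)\).

  • \(\displaystyle T = \sum_{i=1}^n X_i\) is a complete sufficient statistic because the Poisson family is a one-parameter exponential family.
  • \(\displaystyle E[T] = n\lambda\), so the rescaled statistic \(\bar{X} = T/n\) is unbiased for \(\lambda\), and by the Lehmann-Scheffe theorem it is the UMVUE of \(\lambda\).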

Tests

Basic Tests

T-test

Used to test the mean of a sample (or the difference between two sample means) when the variance is unknown and must be estimated from the data.
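
A minimal sketch of a one-sample t-test using SciPy; the data and the null mean of 0 are made up for illustration.

  import numpy as np
  from scipy import stats

  rng = np.random.default_rng(0)
  x = rng.normal(loc=0.3, scale=1.0, size=30)   # made-up sample

  # Test H0: mean = 0 against the two-sided alternative.
  t_stat, p_value = stats.ttest_1samp(x, popmean=0.0)
  print(t_stat, p_value)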

F-test

Used to test the ratio of two variances, e.g. whether two normal samples have equal variance.
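
A minimal sketch computing the F statistic by hand, since the test is just a ratio of sample variances referred to an F distribution; the two samples are made up for illustration.

  import numpy as np
  from scipy import stats

  rng = np.random.default_rng(1)
  x = rng.normal(scale=1.0, size=25)   # made-up samples
  y = rng.normal(scale=1.5, size=30)

  # Under H0 (equal variances), F ~ F(n1 - 1, n2 - 1).
  f_stat = np.var(x, ddof=1) / np.var(y, ddof=1)
  df1, df2 = len(x) - 1, len(y) - 1
  # Two-sided p-value from the F distribution.
  p_value = 2 * min(stats.f.cdf(f_stat, df1, df2), stats.f.sf(f_stat, df1, df2))
  print(f_stat, p_value)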

Likelihood Ratio Test

See Wikipedia: Likelihood Ratio Test

  • \(\displaystyle LR = -2 \log \frac{\sup_{\theta \in \Theta_0} L(\theta)}{\sup_{\theta \in \Theta} L(\theta)}\)
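
A minimal sketch for a Poisson rate, assuming Wilks' theorem so that LR is approximately chi-squared under the null; the counts and the null value 2.5 are made up for illustration.

  import numpy as np
  from scipy import stats

  rng = np.random.default_rng(2)
  x = rng.poisson(lam=3.0, size=50)   # made-up Poisson counts
  lam0 = 2.5                          # hypothetical null value, H0: lambda = 2.5

  def log_lik(lam, data):
      # Poisson log-likelihood; the log(x!) term cancels in the ratio but is included by logpmf.
      return np.sum(stats.poisson.logpmf(data, lam))

  lam_hat = x.mean()   # MLE maximizes L over the full parameter space
  LR = -2 * (log_lik(lam0, x) - log_lik(lam_hat, x))
  p_value = stats.chi2.sf(LR, df=1)   # one parameter constrained under H0
  print(LR, p_value)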

Uniformly Most Powerful Test

UMP Test
See Wikipedia: Neyman-Pearson Lemma

  • \(\displaystyle R_{NP} = \left\{x : \frac{L(\theta_0 | x)}{L(\theta_1 | x)} \leq \eta\right\}\)
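
As an illustrative sketch (added here, not in the original notes): for \(X_1, \dots, X_n\) i.i.d. \(N(\mu, \sigma^2)\) with \(\sigma\) known and simple hypotheses \(\mu_0\) vs \(\mu_1 > \mu_0\), the likelihood ratio is monotone in \(\bar{x}\), so the Neyman-Pearson region simplifies:

  • \(\displaystyle R_{NP} = \left\{x : \bar{x} \geq c\right\}\), with \(c\) chosen so the test has level \(\alpha\).

Since this region does not depend on the particular \(\mu_1 > \mu_0\), the same test is UMP for \(H_0: \mu \leq \mu_0\) vs \(H_1: \mu > \mu_0\).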

ANOVA

Confidence Sets

Confidence Intervals

Relationship with Tests

Regression

Quadratic Forms

Bootstrapping

See Wikipedia: Bootstrapping
Bootstrapping resamples from your sample to estimate the accuracy (e.g. the standard error or a confidence interval) of a statistic.

Nonparametric Bootstrapping

In nonparametric bootstrapping, you resample from your sample with replacement.
In this scenario, you don't need to know the family of distributions that your sample comes from.
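
A minimal sketch of a nonparametric bootstrap for the mean; the sample and the choice of 1000 resamples are made up for illustration.

  import numpy as np

  rng = np.random.default_rng(3)
  x = rng.exponential(scale=2.0, size=40)   # made-up sample
  B = 1000

  # Resample with replacement and recompute the statistic each time.
  boot_means = np.array([rng.choice(x, size=len(x), replace=True).mean() for _ in range(B)])
  se = boot_means.std(ddof=1)                  # bootstrap estimate of the standard error
  ci = np.percentile(boot_means, [2.5, 97.5])  # simple percentile confidence interval
  print(se, ci)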

Parametric Bootstrapping

In parametric bootstrapping, you estimate the parameters of a distribution fit to your sample, e.g. with MLE.
Then you generate bootstrap samples from that fitted distribution on a computer.
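
A minimal sketch, assuming the sample is modeled as exponential so that the MLE of the scale is just the sample mean; the data are made up for illustration.

  import numpy as np

  rng = np.random.default_rng(4)
  x = rng.exponential(scale=2.0, size=40)   # made-up sample

  scale_hat = x.mean()   # MLE of the exponential scale parameter
  B = 1000

  # Generate new samples from the fitted distribution and recompute the statistic.
  boot_means = np.array([rng.exponential(scale=scale_hat, size=len(x)).mean() for _ in range(B)])
  se = boot_means.std(ddof=1)   # parametric bootstrap standard error of the mean
  print(se)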

Textbooks