Probability

Calculus-based Probability

Basics

Axioms of Probability

  • \(P(\Omega) = 1\), where \(\Omega\) is your sample space
  • For mutually exclusive events \(A_1, A_2, \ldots\), \(P\left(\bigcup_i A_i\right) = \sum_i P(A_i)\)

Monotonicity

  • For all events \(A, B\) with \(A \subseteq B\), \(P(A) \leq P(B)\)

Proof

Write \(B = A \cup (B \setminus A)\) as a disjoint union. Then \(P(B) = P(A) + P(B \setminus A) \geq P(A)\).

Expectation and Variance

Some definitions and properties.

Definitions

Let \(X \sim D\) for some distribution \(D\). Let \(\mathcal{X}\) be the support or domain of your distribution.

  • \(E[X] = \sum_{x \in \mathcal{X}} x\, p_X(x)\) or \(E[X] = \int_{\mathcal{X}} x\, f_X(x)\, dx\) (discrete or continuous case)
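
Below is a minimal Python sketch (numpy and scipy) comparing the closed-form mean and variance of an Exponential(2) distribution, chosen arbitrarily for illustration, against Monte Carlo estimates of the defining expectations.

```python
import numpy as np
from scipy import stats

# Rough numerical check of the definitions for X ~ Exponential(rate=2);
# the choice of distribution and rate is purely illustrative.
rate = 2.0
dist = stats.expon(scale=1 / rate)

print(dist.mean(), dist.var())    # closed form: 1/rate = 0.5, 1/rate^2 = 0.25

# Monte Carlo versions of E[X] and Var(X)
rng = np.random.default_rng(0)
x = rng.exponential(scale=1 / rate, size=1_000_000)
print(x.mean(), x.var())          # approximately 0.5 and 0.25
```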

Total Expectation

\(E[X] = E[E[X \mid Y]]\)

Dr. Xu refers to this as the smoothing property.

Proof

For the discrete case, \(E[E[X \mid Y]] = \sum_y E[X \mid Y = y] P(Y = y) = \sum_y \sum_x x\, P(X = x \mid Y = y) P(Y = y) = \sum_x x \sum_y P(X = x, Y = y) = \sum_x x\, P(X = x) = E[X]\).
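
A quick Monte Carlo sketch of the smoothing property, using an illustrative hierarchy (Y ~ Poisson(3), X | Y ~ Normal(Y, 1)) that is not from the original notes:

```python
import numpy as np

# Monte Carlo check of E[X] = E[E[X|Y]] for an illustrative hierarchy:
# Y ~ Poisson(3) and X | Y ~ Normal(Y, 1), so E[X|Y] = Y.
rng = np.random.default_rng(0)
y = rng.poisson(lam=3.0, size=1_000_000)
x = rng.normal(loc=y, scale=1.0)

print(x.mean())  # estimate of E[X]
print(y.mean())  # estimate of E[E[X|Y]] = E[Y]; both are approximately 3
```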

Total Variance

\(\operatorname{Var}(X) = E[\operatorname{Var}(X \mid Y)] + \operatorname{Var}(E[X \mid Y])\)

This one is not used as often on tests as total expectation.

Proof

\(\operatorname{Var}(X) = E[X^2] - (E[X])^2 = E[E[X^2 \mid Y]] - (E[E[X \mid Y]])^2 = E[\operatorname{Var}(X \mid Y) + (E[X \mid Y])^2] - (E[E[X \mid Y]])^2 = E[\operatorname{Var}(X \mid Y)] + \operatorname{Var}(E[X \mid Y])\).
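
A matching Monte Carlo sketch of total variance with an illustrative hierarchy (Y ~ Poisson(3), X | Y ~ Normal(Y, sd = 2)), so the right-hand side is E[Var(X|Y)] + Var(E[X|Y]) = 4 + 3 = 7:

```python
import numpy as np

# Monte Carlo check of Var(X) = E[Var(X|Y)] + Var(E[X|Y]).
# Here Var(X|Y) = 4 and E[X|Y] = Y, so the right side is 4 + Var(Y) = 4 + 3 = 7.
rng = np.random.default_rng(0)
y = rng.poisson(lam=3.0, size=1_000_000)
x = rng.normal(loc=y, scale=2.0)

print(x.var())   # approximately 7
```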


Moments and Moment Generating Functions

Definitions

We call \(E[X^i]\) the i'th moment of \(X\).
We call \(E[(X - E[X])^i]\) the i'th central moment of \(X\).
Therefore the mean is the first moment and the variance is the second central moment.

Moment Generating Functions


\(M_X(t) = E[e^{tX}]\)

We call this the moment generating function (mgf).
We can differentiate it with respect to \(t\) and set \(t = 0\) to get the moments: \(E[X^i] = M_X^{(i)}(0)\).

Notes
  • The mgf, if it exists, uniquely defines the distribution.
  • The mgf of a sum of independent random variables is the product of their mgfs: \(M_{X+Y}(t) = M_X(t)\, M_Y(t)\)
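
As a sketch of getting moments by differentiating the mgf, here the standard mgf of a Normal(mu, sigma^2), \(M(t) = e^{\mu t + \sigma^2 t^2 / 2}\), is differentiated symbolically with sympy (the choice of distribution is illustrative):

```python
import sympy as sp

# Recover moments of a Normal(mu, sigma^2) by differentiating its mgf
# M(t) = exp(mu*t + sigma^2 * t^2 / 2) and evaluating at t = 0.
t, mu = sp.symbols("t mu", real=True)
sigma = sp.symbols("sigma", positive=True)
M = sp.exp(mu * t + sigma**2 * t**2 / 2)

m1 = sp.diff(M, t, 1).subs(t, 0)   # E[X]   = mu
m2 = sp.diff(M, t, 2).subs(t, 0)   # E[X^2] = mu^2 + sigma^2
print(sp.simplify(m1))
print(sp.simplify(m2 - m1**2))     # Var(X) = sigma^2
```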

Characteristic function

\(\varphi_X(t) = E[e^{itX}]\). Unlike the mgf, the characteristic function always exists.

Convergence

There are 4 types of convergence typically taught in undergraduate courses.
See Wikipedia Convergence of random variables

Almost Surely

A sequence of random variables \(X_n\) converges to \(X\) almost surely if \(P\left(\lim_{n \to \infty} X_n = X\right) = 1\).

In Probability

A sequence of random variables \(X_n\) converges to \(X\) in probability if for all \(\epsilon > 0\), \(\lim_{n \to \infty} P(|X_n - X| > \epsilon) = 0\).

  • Implies convergence in distribution

In Distribution

Pointwise convergence of the cdf at continuity points.
A sequence of random variables \(X_n\) converges to \(X\) in distribution if \(\lim_{n \to \infty} F_{X_n}(x) = F_X(x)\) for all \(x\) at which \(F_X\) is continuous.

  • Equivalent to convergence in probability if the limit is a degenerate distribution (i.e., a constant)

In Mean Squared

A sequence of random variables \(X_n\) converges to \(X\) in mean square if \(\lim_{n \to \infty} E[(X_n - X)^2] = 0\). This implies convergence in probability.

Delta Method

See Wikipedia
Suppose \(\sqrt{n}(X_n - \theta) \xrightarrow{D} N(0, \sigma^2)\).
Let \(g\) be a function such that \(g'(\theta)\) exists and \(g'(\theta) \neq 0\).
Then \(\sqrt{n}(g(X_n) - g(\theta)) \xrightarrow{D} N(0, \sigma^2 [g'(\theta)]^2)\).
Multivariate: if \(\sqrt{n}(X_n - \theta) \xrightarrow{D} N(0, \Sigma)\), then \(\sqrt{n}(g(X_n) - g(\theta)) \xrightarrow{D} N\left(0, \nabla g(\theta)^T \Sigma\, \nabla g(\theta)\right)\).

Notes
  • You can think of this like the Mean Value theorem for random variables.
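
A small simulation sketch of the delta method with g(x) = x^2 applied to the sample mean of Exponential(1) draws (all choices illustrative); the limiting variance should be \(\sigma^2 [g'(\theta)]^2 = 4\):

```python
import numpy as np

# Delta method sketch: X_i ~ Exponential(1), so theta = 1 and sigma^2 = 1,
# and with g(x) = x^2 the limiting variance is sigma^2 * g'(theta)^2 = 4.
rng = np.random.default_rng(0)
n, reps = 1_000, 5_000
xbar = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)

z = np.sqrt(n) * (xbar**2 - 1.0)
print(z.mean(), z.var())   # mean approximately 0, variance approximately 4
```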

Order Statistics

Inequalities and Limit Theorems

Markov's Inequality

Let \(X\) be a non-negative random variable.
Then for all \(a > 0\), \(P(X \geq a) \leq \frac{E[X]}{a}\).

Proof

\(E[X] \geq E[X \cdot \mathbf{1}\{X \geq a\}] \geq a\, E[\mathbf{1}\{X \geq a\}] = a\, P(X \geq a)\).

Chebyshev's Inequality

Let \(X\) have finite mean \(\mu\) and variance \(\sigma^2\). Then for all \(a > 0\), \(P(|X - \mu| \geq a) \leq \frac{\sigma^2}{a^2}\).

Proof

Apply Markov's inequality:
Let \(Y = (X - \mu)^2\), which is non-negative. Then \(P(|X - \mu| \geq a) = P(Y \geq a^2) \leq \frac{E[Y]}{a^2} = \frac{\sigma^2}{a^2}\).

  • Usually used to prove convergence in probability
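
A quick empirical check of both Markov's and Chebyshev's bounds for X ~ Exponential(1), a distribution chosen arbitrarily for illustration:

```python
import numpy as np

# Empirical check of Markov's and Chebyshev's inequalities for X ~ Exponential(1),
# where E[X] = Var(X) = 1.
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)
a = 3.0

print(np.mean(x >= a), 1.0 / a)                    # P(X >= a) vs Markov bound E[X]/a
print(np.mean(np.abs(x - 1.0) >= a), 1.0 / a**2)   # P(|X - E[X]| >= a) vs Chebyshev bound Var(X)/a^2
```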

Central Limit Theorem

Very very important. Never forget this.
For any distribution with finite variance, the standardized sample mean converges in distribution to a normal.
Let \(X_1, \ldots, X_n\) be iid with \(E[X_i] = \mu\) and \(\operatorname{Var}(X_i) = \sigma^2 < \infty\).
Different ways of saying the same thing:

  • \(\sqrt{n}(\bar{X}_n - \mu) \xrightarrow{D} N(0, \sigma^2)\)
  • \(\frac{\bar{X}_n - \mu}{\sigma / \sqrt{n}} \xrightarrow{D} N(0, 1)\)
  • \(\bar{X}_n \approx N(\mu, \sigma^2 / n)\) for large \(n\)
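
A minimal simulation sketch of the CLT using Exponential(1) samples (\(\mu = \sigma = 1\), an arbitrary choice); the standardized sample means should look standard normal for large n:

```python
import numpy as np
from scipy import stats

# CLT sketch: standardized sample means of Exponential(1) draws.
rng = np.random.default_rng(0)
n, reps = 1_000, 5_000
xbar = rng.exponential(scale=1.0, size=(reps, n)).mean(axis=1)
z = (xbar - 1.0) * np.sqrt(n)   # (xbar - mu) / (sigma / sqrt(n)) with sigma = 1

# Kolmogorov-Smirnov test against N(0, 1); a large p-value is consistent with normality.
print(stats.kstest(z, "norm"))
```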

Law of Large Numbers

The sample mean converges to the population mean in probability.
For all \(\epsilon > 0\), \(\lim_{n \to \infty} P(|\bar{X}_n - \mu| > \epsilon) = 0\).

Notes
  • The sample mean also converges to the population mean almost surely (the strong law of large numbers).
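
A small sketch of the law of large numbers: the running sample mean of Uniform(0, 1) draws (an arbitrary choice) drifts toward the population mean 0.5:

```python
import numpy as np

# LLN sketch: running sample mean of Uniform(0, 1) draws, true mean 0.5.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=1_000_000)
running_mean = np.cumsum(x) / np.arange(1, x.size + 1)

for n in (10, 1_000, 100_000, 1_000_000):
    print(n, running_mean[n - 1])   # drifts toward 0.5 as n grows
```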

Relationships between distributions

This is important for tests.
See Relationships among probability distributions.

Poisson Distributions

The sum of independent Poisson random variables is Poisson with rate equal to the sum of the rates: if \(X_i \sim \operatorname{Poisson}(\lambda_i)\) are independent, then \(\sum_i X_i \sim \operatorname{Poisson}\left(\sum_i \lambda_i\right)\).
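
A quick check of this with simulated data (rates 2 and 3 chosen arbitrarily):

```python
import numpy as np
from scipy import stats

# Check that Poisson(2) + Poisson(3) (independent) matches Poisson(5).
rng = np.random.default_rng(0)
s = rng.poisson(lam=2.0, size=500_000) + rng.poisson(lam=3.0, size=500_000)

for k in range(4):
    print(k, np.mean(s == k), stats.poisson.pmf(k, mu=5.0))  # empirical vs exact pmf
```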

Normal Distributions

  • If \(X \sim N(\mu_X, \sigma_X^2)\) and \(Y \sim N(\mu_Y, \sigma_Y^2)\) are independent, then for any \(a, b \in \mathbb{R}\), \(aX + bY \sim N(a\mu_X + b\mu_Y,\; a^2\sigma_X^2 + b^2\sigma_Y^2)\)
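
A quick numerical check of the linear-combination rule with illustrative parameters:

```python
import numpy as np

# For independent X ~ N(1, 4) and Y ~ N(2, 9), 2X - 3Y should be
# N(2*1 - 3*2, 4*4 + 9*9) = N(-4, 97).
rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=2.0, size=1_000_000)
y = rng.normal(loc=2.0, scale=3.0, size=1_000_000)
z = 2 * x - 3 * y

print(z.mean(), z.var())   # approximately -4 and 97
```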

Gamma Distributions

Note that exponential distributions are also Gamma distributions (a Gamma with shape parameter 1).

  • If \(X \sim \operatorname{Gamma}(k, \theta)\), then \(cX \sim \operatorname{Gamma}(k, c\theta)\) for any \(c > 0\).
  • If \(X \sim \operatorname{Gamma}(k_1, \theta)\) and \(Y \sim \operatorname{Gamma}(k_2, \theta)\) are independent, then \(X + Y \sim \operatorname{Gamma}(k_1 + k_2, \theta)\).
  • If \(X \sim \operatorname{Gamma}(k_1, \theta)\) and \(Y \sim \operatorname{Gamma}(k_2, \theta)\) are independent, then \(\frac{X}{X + Y} \sim \operatorname{Beta}(k_1, k_2)\).
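
A sketch checking the sum rule by comparing simulated Gamma(2, θ) + Gamma(3, θ) draws against the Gamma(5, θ) cdf (parameter values are illustrative):

```python
import numpy as np
from scipy import stats

# Check that Gamma(2, theta) + Gamma(3, theta) (independent, shared scale theta)
# matches Gamma(5, theta).
rng = np.random.default_rng(0)
theta = 2.0
s = rng.gamma(shape=2.0, scale=theta, size=200_000) \
    + rng.gamma(shape=3.0, scale=theta, size=200_000)

# Kolmogorov-Smirnov test against the Gamma(5, theta) cdf
print(stats.kstest(s, stats.gamma(a=5.0, scale=theta).cdf))
```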

T-distribution

The ratio of a standard normal to the square root of an independent chi-squared divided by its degrees of freedom yields a t-distribution: if \(Z \sim N(0, 1)\) and \(V \sim \chi^2_k\) are independent, then \(\frac{Z}{\sqrt{V / k}} \sim t_k\).
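
A simulation sketch of this construction (k = 5 chosen arbitrarily):

```python
import numpy as np
from scipy import stats

# Check that Z / sqrt(V / k) follows a t-distribution with k degrees of freedom
# for independent Z ~ N(0, 1) and V ~ chi^2_k.
rng = np.random.default_rng(0)
k = 5
z = rng.standard_normal(200_000)
v = rng.chisquare(df=k, size=200_000)

print(stats.kstest(z / np.sqrt(v / k), stats.t(df=k).cdf))
```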

Chi-Sq Distribution

The ratio of two independent chi-squared random variables, each divided by its degrees of freedom, is an F-distribution: if \(U \sim \chi^2_{d_1}\) and \(V \sim \chi^2_{d_2}\) are independent, then \(\frac{U / d_1}{V / d_2} \sim F(d_1, d_2)\).
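
A simulation sketch of the F construction (d1 = 4, d2 = 7 chosen arbitrarily):

```python
import numpy as np
from scipy import stats

# Check that (U/d1) / (V/d2) follows F(d1, d2) for independent chi-squared U, V.
rng = np.random.default_rng(0)
d1, d2 = 4, 7
u = rng.chisquare(df=d1, size=200_000)
v = rng.chisquare(df=d2, size=200_000)

print(stats.kstest((u / d1) / (v / d2), stats.f(dfn=d1, dfd=d2).cdf))
```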

F Distribution

There are too many relationships to list. See the Wikipedia page. The most important are the chi-squared and t-distribution relationships.

Textbooks