From David's Wiki

Calculus-based Probability


Axioms of Probability

  • $P(\Omega) = 1$ where $\Omega$ is your sample space
  • For mutually exclusive events $E_1, E_2, \ldots$, $P\left(\bigcup_i E_i\right) = \sum_i P(E_i)$
  • For all events $E$, $P(E) \geq 0$, and consequently $0 \leq P(E) \leq 1$

Expectation and Variance

Some definitions and properties.


Let $X \sim D$ for some distribution $D$. Let $\mathcal{X}$ be the support or domain of your distribution.

  • $E[X] = \sum_{x \in \mathcal{X}} x\, p(x)$ (discrete) or $E[X] = \int_{\mathcal{X}} x f(x)\, dx$ (continuous)
  • $\operatorname{Var}(X) = E[(X - E[X])^2] = E[X^2] - (E[X])^2$
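A quick sanity check of the discrete formulas on a fair six-sided die (the die example is mine, not from the page):

```python
from fractions import Fraction

# Discrete example: a fair six-sided die, p(x) = 1/6 on support {1, ..., 6}.
support = range(1, 7)
p = Fraction(1, 6)

# E[X] = sum over the support of x * p(x)
mean = sum(x * p for x in support)                 # 7/2

# Var(X) = E[X^2] - (E[X])^2
second_moment = sum(x * x * p for x in support)
var = second_moment - mean ** 2                    # 35/12

print(mean, var)  # 7/2 35/12
```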

Total Expectation

Dr. Xu refers to this as the smooth property.

$E[X] = E\left[E[X \mid Y]\right]$
Total Variance

This one is not used as often on tests as total expectation.

$\operatorname{Var}(X) = E[\operatorname{Var}(X \mid Y)] + \operatorname{Var}(E[X \mid Y])$
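Both laws, total expectation $E[X] = E[E[X \mid Y]]$ and total variance $\operatorname{Var}(X) = E[\operatorname{Var}(X \mid Y)] + \operatorname{Var}(E[X \mid Y])$, can be verified exactly on a small two-stage example of my own (not from the page): $Y$ picks one of two coins, then $X$ is a flip of that coin.

```python
from fractions import Fraction as F

# Hypothetical two-stage example: Y picks a coin (each with prob 1/2),
# then X | Y=y is Bernoulli(p_y) with p_0 = 1/4, p_1 = 3/4.
p_y = [F(1, 2), F(1, 2)]
p_cond = [F(1, 4), F(3, 4)]

# For a Bernoulli(p): E[X|Y] = p and Var(X|Y) = p(1-p)
cond_mean = p_cond
cond_var = [p * (1 - p) for p in p_cond]

# Law of total expectation: E[X] = E[E[X|Y]]
EX = sum(py * m for py, m in zip(p_y, cond_mean))

# Law of total variance: Var(X) = E[Var(X|Y)] + Var(E[X|Y])
E_cond_var = sum(py * v for py, v in zip(p_y, cond_var))
var_cond_mean = sum(py * m * m for py, m in zip(p_y, cond_mean)) - EX ** 2
VarX = E_cond_var + var_cond_mean

# Marginally X ~ Bernoulli(1/2), so we expect E[X] = 1/2 and Var(X) = 1/4.
print(EX, VarX)  # 1/2 1/4
```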


Sample Mean and Variance

The sample mean is $\bar{X} = \frac{1}{n}\sum_{i=1}^n X_i$.
The unbiased sample variance is $S^2 = \frac{1}{n-1}\sum_{i=1}^n (X_i - \bar{X})^2$.
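These two formulas can be checked against Python's `statistics` module, which uses the same $1/n$ and $1/(n-1)$ conventions (the data below is a made-up sample):

```python
import statistics

# Small hypothetical sample
x = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(x)

xbar = sum(x) / n                                  # sample mean, 1/n
s2 = sum((xi - xbar) ** 2 for xi in x) / (n - 1)   # unbiased sample variance, 1/(n-1)

# statistics.mean and statistics.variance implement the same formulas
print(xbar, s2)
```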

Student's Theorem

Let $X_1, \ldots, X_n$ be iid from $N(\mu, \sigma^2)$.
Then the following results about the sample mean $\bar{X}$ and the unbiased sample variance $S^2$ hold:

  • $\bar{X} \sim N(\mu, \sigma^2 / n)$
  • $\frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1}$
  • $\bar{X}$ and $S^2$ are independent
  • $\frac{\bar{X} - \mu}{S / \sqrt{n}} \sim t_{n-1}$

Moments and Moment Generating Functions


We call $E[X^i]$ the i'th moment of $X$.
We call $E[(X - E[X])^i]$ the i'th central moment of $X$.
Therefore the mean is the first moment and the variance is the second central moment.

Moment Generating Functions

$M_X(t) = E[e^{tX}]$
We call this the moment generating function (mgf).
We can differentiate it with respect to $t$ and set $t = 0$ to get the higher moments: $M_X^{(i)}(0) = E[X^i]$.

  • The mgf, if it exists, uniquely defines the distribution.
  • The mgf of a sum of independent random variables is the product of their mgfs: $M_{X+Y}(t) = M_X(t) M_Y(t)$
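A numerical sketch of "differentiate and set $t = 0$", using my own example of $X \sim \operatorname{Exp}(\lambda)$, whose mgf is $M(t) = \lambda/(\lambda - t)$ for $t < \lambda$, so $M'(0) = 1/\lambda = E[X]$ and $M''(0) = 2/\lambda^2 = E[X^2]$:

```python
# Finite-difference check: derivatives of the Exp(lambda) mgf at t = 0
# should recover the first two moments.  With lam = 2 both are 0.5.
lam = 2.0
M = lambda t: lam / (lam - t)

h = 1e-4
first = (M(h) - M(-h)) / (2 * h)             # central difference for M'(0)
second = (M(h) - 2 * M(0) + M(-h)) / h ** 2  # central difference for M''(0)

print(first, second)  # both approximately 0.5
```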

Characteristic function

$\varphi_X(t) = E[e^{itX}]$
Unlike the mgf, the characteristic function always exists.

Convergence

There are 4 types of convergence typically taught in undergraduate courses.
See Wikipedia Convergence of random variables.

Almost Surely

$P\left(\lim_{n \to \infty} X_n = X\right) = 1$

In Probability

For all $\epsilon > 0$, $\lim_{n \to \infty} P(|X_n - X| > \epsilon) = 0$

  • Implies convergence in distribution

In Distribution

Pointwise convergence of the cdf at continuity points.
A sequence of random variables $X_n$ converges to $X$ in distribution if $\lim_{n \to \infty} F_{X_n}(x) = F_X(x)$ for all $x$ at which $F_X$ is continuous.

  • Equivalent to convergence in probability if it converges to a degenerate distribution (i.e. a number)

In Mean Squared

$\lim_{n \to \infty} E\left[(X_n - X)^2\right] = 0$

  • Implies convergence in probability

Delta Method

See Wikipedia
Suppose $\sqrt{n}(X_n - \theta) \xrightarrow{D} N(0, \sigma^2)$.
Let $g$ be a function such that $g'(\theta)$ exists and $g'(\theta) \neq 0$.
Then $\sqrt{n}\left(g(X_n) - g(\theta)\right) \xrightarrow{D} N\left(0, \sigma^2 [g'(\theta)]^2\right)$.

  • You can think of this like the Mean Value theorem for random variables.
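A simulation sketch of the delta method under assumptions of my choosing: $X_i \sim \operatorname{Exp}(1)$, so $\theta = 1$, $\sigma^2 = 1$, and with $g(x) = x^2$ the limiting variance should be $\sigma^2 [g'(\theta)]^2 = 4$.

```python
import random, statistics

random.seed(0)

# Delta-method check (my example): sqrt(n) * (g(Xbar) - g(theta))
# should be approximately N(0, 4) when g(x) = x^2 and X_i ~ Exp(1).
n, reps = 400, 2000
g = lambda x: x * x

vals = []
for _ in range(reps):
    xbar = sum(random.expovariate(1.0) for _ in range(n)) / n
    vals.append(n ** 0.5 * (g(xbar) - g(1.0)))

approx_var = statistics.pvariance(vals)
print(approx_var)  # close to 4
```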

Order Statistics

Let $X_1, \ldots, X_n$ be iid with cdf $F$.

  • $P\left(\max_i X_i \leq x\right) = F(x)^n$
  • $P\left(\min_i X_i \leq x\right) = 1 - (1 - F(x))^n$

Inequalities and Limit Theorems

Markov's Inequality

Let $X$ be a non-negative random variable.
Then for any $a > 0$, $P(X \geq a) \leq \frac{E[X]}{a}$.
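The bound can be checked in closed form for $X \sim \operatorname{Exp}(1)$ (my example): here $E[X] = 1$ and $P(X \geq a) = e^{-a}$ exactly, so Markov's bound is $1/a$.

```python
import math

# For X ~ Exp(1): exact tail e^{-a} versus Markov's bound E[X]/a = 1/a.
for a in [1, 2, 5, 10]:
    exact = math.exp(-a)
    bound = 1 / a
    assert exact <= bound  # Markov's inequality holds
    print(a, exact, bound)
```

Note how loose the bound is for large $a$: the exact tail decays exponentially while the bound only decays like $1/a$.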


Chebyshev's Inequality

Let $X$ have mean $\mu$ and variance $\sigma^2$. Then for any $a > 0$, $P(|X - \mu| \geq a) \leq \frac{\sigma^2}{a^2}$.

Apply Markov's inequality: $P(|X - \mu| \geq a) = P\left((X - \mu)^2 \geq a^2\right) \leq \frac{E\left[(X - \mu)^2\right]}{a^2} = \frac{\sigma^2}{a^2}$

  • Usually used to prove convergence in probability
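For $Z \sim N(0, 1)$ (my example) the two-sided tail is available exactly via the error function, $P(|Z| \geq k) = 1 - \operatorname{erf}(k/\sqrt{2})$, so Chebyshev's bound $1/k^2$ (taking $a = k\sigma$) can be compared directly:

```python
import math

# Exact standard-normal tail versus Chebyshev's bound 1/k^2.
for k in [1.5, 2, 3]:
    exact = 1 - math.erf(k / math.sqrt(2))  # P(|Z| >= k)
    bound = 1 / k ** 2
    assert exact <= bound  # Chebyshev's inequality holds
    print(k, round(exact, 4), round(bound, 4))
```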

Central Limit Theorem

Very very important. Never forget this.
For any distribution with finite variance, the sample mean converges in distribution to a normal distribution.
Let $\mu = E[X_i]$ and $\sigma^2 = \operatorname{Var}(X_i) < \infty$.
Different ways of saying the same thing:

  • $\sqrt{n}\left(\bar{X}_n - \mu\right) \xrightarrow{D} N(0, \sigma^2)$
  • $\frac{\bar{X}_n - \mu}{\sigma / \sqrt{n}} \xrightarrow{D} N(0, 1)$
  • $\frac{\sum_{i=1}^n X_i - n\mu}{\sigma \sqrt{n}} \xrightarrow{D} N(0, 1)$
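A simulation sketch with distributions of my choosing: for $X_i \sim \operatorname{Uniform}(0, 1)$ ($\mu = 1/2$, $\sigma^2 = 1/12$), the standardized sample mean should be close to $N(0, 1)$, so roughly 95% of draws should land within $\pm 1.96$.

```python
import random

random.seed(0)

# Standardize sample means of Uniform(0,1) draws and measure how often
# they fall inside the normal 95% interval [-1.96, 1.96].
n, reps = 100, 2000
mu, sigma = 0.5, (1 / 12) ** 0.5

hits = 0
for _ in range(reps):
    xbar = sum(random.random() for _ in range(n)) / n
    z = (xbar - mu) / (sigma / n ** 0.5)
    if abs(z) <= 1.96:
        hits += 1

coverage = hits / reps
print(coverage)  # close to 0.95
```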

Law of Large Numbers

The sample mean converges to the population mean in probability (the weak law).
For all $\epsilon > 0$, $\lim_{n \to \infty} P\left(|\bar{X}_n - \mu| > \epsilon\right) = 0$

  • The sample mean also converges to the population mean almost surely (the strong law).
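A minimal illustration with a distribution of my choosing: for $X_i \sim \operatorname{Exp}(2)$ the population mean is $1/2$, and the sample mean should be close to it for large $n$.

```python
import random

random.seed(1)

# Sample means of Exp(2) draws at increasing n; population mean is 1/2.
def sample_mean(n):
    return sum(random.expovariate(2.0) for _ in range(n)) / n

means = {n: sample_mean(n) for n in [10, 1000, 100000]}
print(means)
```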

Properties and Relationships between distributions

This is important for exams.
See Relationships among probability distributions.

Poisson Distribution

  • If $X_i \sim \operatorname{Poisson}(\lambda_i)$ are independent then $\sum_i X_i \sim \operatorname{Poisson}\left(\sum_i \lambda_i\right)$

Normal Distribution

  • If $X \sim N(\mu_X, \sigma_X^2)$ and $Y \sim N(\mu_Y, \sigma_Y^2)$ are independent then $aX + bY \sim N\left(a\mu_X + b\mu_Y,\; a^2\sigma_X^2 + b^2\sigma_Y^2\right)$ for any $a, b$

Exponential Distribution

  • $X \sim \operatorname{Exp}(\lambda)$ is equivalent to $X \sim \operatorname{Gamma}(1, \lambda)$
    • Note that some conventions flip the second parameter of gamma, so it would be $X \sim \operatorname{Gamma}(1, 1/\lambda)$
  • If $X_i \sim \operatorname{Exp}(\lambda_i)$ are independent exponential distributions then $\min_i X_i \sim \operatorname{Exp}\left(\sum_i \lambda_i\right)$
  • Note that the maximum is not exponentially distributed
    • However, if $X_1, \ldots, X_n \overset{iid}{\sim} \operatorname{Exp}(\lambda)$ then $P\left(\max_i X_i \leq x\right) = \left(1 - e^{-\lambda x}\right)^n$
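The minimum fact can be checked by simulation (rates of my choosing): with $X \sim \operatorname{Exp}(1)$ and $Y \sim \operatorname{Exp}(3)$ independent, $\min(X, Y) \sim \operatorname{Exp}(4)$, whose mean is $1/4$.

```python
import random, statistics

random.seed(0)

# min(Exp(1), Exp(3)) should behave like Exp(1 + 3) = Exp(4).
reps = 100000
mins = [min(random.expovariate(1.0), random.expovariate(3.0)) for _ in range(reps)]

print(statistics.mean(mins))  # close to 1/4
```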

Gamma Distribution

Note exponential distributions are also Gamma distributions.

  • If $X \sim \operatorname{Gamma}(k, \theta)$ (shape–scale convention) then $cX \sim \operatorname{Gamma}(k, c\theta)$ for any $c > 0$.
  • If $X \sim \operatorname{Gamma}(k_1, \theta)$ and $Y \sim \operatorname{Gamma}(k_2, \theta)$ are independent then $X + Y \sim \operatorname{Gamma}(k_1 + k_2, \theta)$.
  • If $X \sim \operatorname{Gamma}(a, \theta)$ and $Y \sim \operatorname{Gamma}(b, \theta)$ are independent, then $\frac{X}{X+Y} \sim \operatorname{Beta}(a, b)$.

T Distribution

  • Ratio of a standard normal and the square root of a normalized Chi-sq distribution yields a T-distribution.
    • If $Z \sim N(0, 1)$ and $V \sim \chi^2_\nu$ are independent then $\frac{Z}{\sqrt{V/\nu}} \sim t_\nu$
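The sum rule can be checked by simulation with parameters of my choosing, using `random.gammavariate` (whose second argument is the scale, so the mean is shape × scale): $\operatorname{Gamma}(2, 0.5) + \operatorname{Gamma}(3, 0.5) \sim \operatorname{Gamma}(5, 0.5)$, with mean $5 \times 0.5 = 2.5$.

```python
import random, statistics

random.seed(0)

# Sum of independent Gamma(2, 0.5) and Gamma(3, 0.5) draws (shape-scale
# convention); the sum should behave like Gamma(5, 0.5), mean 2.5.
reps = 50000
sums = [random.gammavariate(2, 0.5) + random.gammavariate(3, 0.5) for _ in range(reps)]

print(statistics.mean(sums))  # close to 2.5
```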

Chi-Sq Distribution

  • The ratio of two normalized Chi-sq is an F-distribution
    • If $U \sim \chi^2_{d_1}$ and $V \sim \chi^2_{d_2}$ are independent then $\frac{U/d_1}{V/d_2} \sim F(d_1, d_2)$
  • If $Z \sim N(0, 1)$ then $Z^2 \sim \chi^2_1$
  • If $X_i \sim \chi^2_{k_i}$ are independent then $\sum_i X_i \sim \chi^2_{\sum_i k_i}$
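The $Z^2 \sim \chi^2_1$ fact can be checked by simulation: $\chi^2_1$ has mean 1 and variance 2.

```python
import random, statistics

random.seed(0)

# Squares of standard normal draws should have mean 1 and variance 2,
# matching the chi-square distribution with 1 degree of freedom.
reps = 100000
zsq = [random.gauss(0, 1) ** 2 for _ in range(reps)]

print(statistics.mean(zsq), statistics.pvariance(zsq))  # near 1 and 2
```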

F Distribution

Too many to list. See the Wikipedia page. Most important are the Chi-sq and T distribution relationships.

  • If $U \sim \chi^2_{d_1}$ and $V \sim \chi^2_{d_2}$ are independent then $\frac{U/d_1}{V/d_2} \sim F(d_1, d_2)$
  • If $X \sim t_\nu$ then $X^2 \sim F(1, \nu)$ and $X^{-2} \sim F(\nu, 1)$