Probability

From David's Wiki

Introductory Probability as taught in Sheldon Ross' book


Common Distributions

This is important for tests.
See Relationships among probability distributions.

Normal + Normal

If \(\displaystyle X_1 \sim N(\mu_1, \sigma_1^2)\) and \(\displaystyle X_2 \sim N(\mu_2, \sigma_2^2)\) are independent, then \(\displaystyle \lambda_1 X_1 + \lambda_2 X_2 \sim N(\lambda_1 \mu_1 + \lambda_2 \mu_2, \lambda_1^2 \sigma_1^2 + \lambda_2^2 \sigma_2^2)\) for any \(\displaystyle \lambda_1, \lambda_2 \in \mathbb{R}\)
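
A minimal Monte Carlo sketch of this fact, assuming NumPy; the parameter values (mu1, sigma1, mu2, sigma2, lam1, lam2) are arbitrary choices for illustration, not anything from the text.

```python
import numpy as np

# Sketch: compare the sample mean/variance of lam1*X1 + lam2*X2 against the
# theoretical N(lam1*mu1 + lam2*mu2, lam1^2*sigma1^2 + lam2^2*sigma2^2).
rng = np.random.default_rng(0)
mu1, sigma1 = 1.0, 2.0      # assumed values
mu2, sigma2 = -3.0, 0.5     # assumed values
lam1, lam2 = 0.7, -1.2      # assumed values
n = 1_000_000

x1 = rng.normal(mu1, sigma1, n)   # independent draws of X1
x2 = rng.normal(mu2, sigma2, n)   # independent draws of X2
y = lam1 * x1 + lam2 * x2

mean_theory = lam1 * mu1 + lam2 * mu2
var_theory = lam1**2 * sigma1**2 + lam2**2 * sigma2**2

print(y.mean(), mean_theory)  # should agree closely
print(y.var(), var_theory)    # should agree closely
```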

Gamma Distributions

Note that exponential distributions are also Gamma distributions: an exponential distribution is a Gamma distribution with shape parameter 1.
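
A quick simulation check of this note, assuming NumPy and SciPy; the scale value theta is an arbitrary assumption.

```python
import numpy as np
from scipy import stats

# Sketch: an Exponential sample with scale theta should be consistent with
# the Gamma(shape=1, scale=theta) distribution.
rng = np.random.default_rng(1)
theta = 2.0          # assumed scale parameter
n = 100_000

expo = rng.exponential(scale=theta, size=n)

# Kolmogorov-Smirnov test against the Gamma(1, theta) CDF.
ks = stats.kstest(expo, stats.gamma(a=1, scale=theta).cdf)
print(ks.pvalue)     # a large p-value is consistent with Gamma(1, theta)
```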

Gamma and Beta

If \(\displaystyle X_1 \sim \Gamma(\alpha, \theta)\) and \(\displaystyle X_2 \sim \Gamma(\beta, \theta)\) are independent, then \(\displaystyle \frac{X_1}{X_1 + X_2} \sim B(\alpha, \beta)\)
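
A simulation sketch of the Gamma-to-Beta relationship, again assuming NumPy and SciPy; alpha, beta, and theta are arbitrary assumed values.

```python
import numpy as np
from scipy import stats

# Sketch: draw X1 ~ Gamma(alpha, theta) and X2 ~ Gamma(beta, theta) independently
# and check that X1 / (X1 + X2) matches the Beta(alpha, beta) distribution.
rng = np.random.default_rng(2)
alpha, beta, theta = 2.0, 5.0, 3.0   # assumed values
n = 100_000

x1 = rng.gamma(shape=alpha, scale=theta, size=n)
x2 = rng.gamma(shape=beta, scale=theta, size=n)
ratio = x1 / (x1 + x2)

ks = stats.kstest(ratio, stats.beta(alpha, beta).cdf)
print(ks.pvalue)   # a large p-value is consistent with Beta(alpha, beta)
```

Note that the common scale parameter \(\displaystyle \theta\) cancels in the ratio, which is why the Beta distribution does not depend on it.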