Probability

* For all events <math>A</math> and <math>B</math>, <math>A \subset B \implies P(A) \leq P(B)</math>
{{hidden | Proof | If <math>A \subset B</math>, then <math>B = A \cup (B \setminus A)</math>, where <math>A</math> and <math>B \setminus A</math> are disjoint. By additivity and non-negativity, <math>P(B) = P(A) + P(B \setminus A) \geq P(A)</math>.}}
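As a quick sanity check, here is a minimal Python sketch of monotonicity on a finite sample space; the fair-die events are arbitrary choices for illustration.
<syntaxhighlight lang="python">
from fractions import Fraction

# Sample space: a fair six-sided die; each outcome has probability 1/6.
omega = {1, 2, 3, 4, 5, 6}
P = lambda event: Fraction(len(event), len(omega))

A = {2, 4}     # event "roll is 2 or 4"
B = {2, 4, 6}  # event "roll is even"; note A is a subset of B

assert A <= B        # A is a subset of B
assert P(A) <= P(B)  # monotonicity: P(A) <= P(B)
print(P(A), P(B))    # 1/3 1/2
</syntaxhighlight>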
===Conditional Probability===
<math>P(A|B)</math> is the probability of event <math>A</math> given event <math>B</math>.<br>
Mathematically, this is defined as <math>P(A|B) = \frac{P(A,B)}{P(B)}</math>.<br>
Note that this can also be written as <math>P(A|B)P(B) = P(A, B)</math>.
Since by symmetry <math>P(A,B) = P(B|A)P(A)</math>, substituting gives '''Bayes' Theorem''':
<math>
P(A|B) = \frac{P(B|A)P(A)}{P(B)}
</math>
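A minimal Python sketch of Bayes' theorem; the diagnostic-test numbers below are hypothetical, chosen purely for illustration, and the denominator is expanded using the law of total probability.
<syntaxhighlight lang="python">
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B).
# Hypothetical numbers for a diagnostic test:
p_disease = 0.01            # P(A): prior probability of disease
p_pos_given_disease = 0.95  # P(B|A): test sensitivity
p_pos_given_healthy = 0.05  # P(B|not A): false-positive rate

# Law of total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(p_disease_given_pos)  # about 0.16
</syntaxhighlight>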
==Random Variables==
A random variable is a quantity whose value depends on a random outcome; it is described by a probability distribution rather than a single fixed value.


===PMF, PDF, CDF===
For a discrete random variable, the probability mass function (PMF) is <math>p(x) = P(X = x)</math>.<br>
For a continuous random variable, the probability density function (PDF) is the function <math>f(x)</math> satisfying <math>P(a \leq X \leq b) = \int_a^b f(x) dx</math>.<br>
The cumulative distribution function (CDF) is <math>F(x) = P(X \leq x)</math>.<br>
The CDF is the prefix sum of the PMF or the integral of the PDF. Likewise, the PDF is the derivative of the CDF.
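These relationships can be checked numerically in Python. The sketch below assumes NumPy and SciPy are available; the binomial and normal distributions are arbitrary examples.
<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

# Discrete case: the CDF is the prefix (cumulative) sum of the PMF.
X = stats.binom(n=10, p=0.3)
ks = np.arange(0, 11)
assert np.allclose(np.cumsum(X.pmf(ks)), X.cdf(ks))

# Continuous case: the PDF is the derivative of the CDF, so numerically
# differentiating the CDF should recover the PDF.
Y = stats.norm(loc=0, scale=1)
x, h = 1.2, 1e-6
pdf_from_cdf = (Y.cdf(x + h) - Y.cdf(x - h)) / (2 * h)
assert abs(pdf_from_cdf - Y.pdf(x)) < 1e-6
</syntaxhighlight>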
===Joint Random Variables===
Two random variables are independent if <math>f_{X,Y}(x,y) = f_X(x) f_Y(y)</math>.<br>
In general, the marginal distribution of <math>X</math> is recovered by integrating out <math>Y</math>: <math>f_X(x) = \int f_{X,Y}(x,y) dy</math>.
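In the discrete case, marginalizing means summing the joint PMF over the other variable. A minimal Python sketch, using a small hypothetical joint table:
<syntaxhighlight lang="python">
import numpy as np

# A hypothetical joint PMF over (X, Y): rows indexed by x, columns by y.
joint = np.array([[0.10, 0.20],
                  [0.30, 0.40]])

# Marginals: sum out the other variable (the discrete analogue of
# integrating f_{X,Y}(x,y) over y or x).
p_x = joint.sum(axis=1)  # [0.3, 0.7]
p_y = joint.sum(axis=0)  # [0.4, 0.6]

# Independence check: the joint factors as the outer product of marginals.
print(np.allclose(joint, np.outer(p_x, p_y)))  # False for this table
</syntaxhighlight>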
===Conditional Random Variables===
...
===Change of variables===
Let <math>g</math> be a monotonically increasing function and <math>Y = g(X)</math>.<br>
Then <math>F_Y(y) = P(Y \leq y) = P(X \leq g^{-1}(y)) = F_X(g^{-1}(y))</math>.<br>
Differentiating with the chain rule, <math>f_Y(y) = \frac{d}{dy}F_Y(y) = \frac{d}{dy} F_X(g^{-1}(y)) = f_X(g^{-1}(y)) \frac{d}{dy}g^{-1}(y)</math>.
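A minimal Python check of this formula, assuming SciPy is available; the map <math>Y = e^X</math> with <math>X \sim N(0,1)</math> is an arbitrary example, for which the formula should reproduce the standard lognormal density.
<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

# Change of variables for Y = g(X) = exp(X) with X ~ Normal(0, 1).
# Here g^{-1}(y) = log(y) and (d/dy) g^{-1}(y) = 1/y, so the formula
# gives f_Y(y) = f_X(log(y)) / y, which should match the lognormal PDF.
X = stats.norm(0, 1)
y = np.linspace(0.1, 5.0, 50)

f_Y = X.pdf(np.log(y)) / y  # f_X(g^{-1}(y)) * (d/dy) g^{-1}(y)
assert np.allclose(f_Y, stats.lognorm(s=1).pdf(y))
</syntaxhighlight>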


==Expectation and Variance==