Unsupervised Learning

We will fix <math>\theta</math> and optimize with respect to <math>Q</math>.<br>
Jensen's inequality holds with equality iff either the function is linear or the random variable is degenerate.<br>
Since log is not linear, equality requires the random variable to be degenerate, i.e. <math>\frac{P(x^{(i)}, z^{(i)}=j; \theta)}{Q^{(i)}(j)}</math> must be a constant that does not depend on <math>j</math>.<br>
This implies <math>Q^{(i)}(j) = c \cdot P(x^{(i)}, z^{(i)} = j ; \theta)</math>.<br>
Since <math>Q^{(i)}</math> is a probability distribution, summing over <math>j</math> gives <math>c = 1/P(x^{(i)}; \theta)</math>, so <math>Q^{(i)}(j) = P(z^{(i)} = j \mid x^{(i)}; \theta)</math>, the posterior over <math>z^{(i)}</math>.<br>
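
As a concrete illustration (a minimal sketch, not from the original notes), the E-step below computes <math>Q^{(i)}(j)</math> for a Gaussian mixture by forming the joint <math>P(x^{(i)}, z^{(i)}=j; \theta)</math> for each <math>j</math> and normalizing over <math>j</math>. The function name <code>e_step</code>, its parameters, and the Gaussian-mixture parameterization are assumptions made for this example.

<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import multivariate_normal

def e_step(X, weights, means, covs):
    """Hypothetical example: Q[i, j] = P(z=j | x_i; theta) for a Gaussian mixture.

    X: (n, d) data; weights: length-k mixing proportions;
    means: list of k mean vectors; covs: list of k covariance matrices.
    """
    n, k = X.shape[0], len(weights)
    joint = np.empty((n, k))
    for j in range(k):
        # Joint P(x, z=j; theta) = P(z=j) * P(x | z=j)
        joint[:, j] = weights[j] * multivariate_normal.pdf(X, mean=means[j], cov=covs[j])
    # Normalizing each row implements Q^{(i)}(j) = c * P(x^{(i)}, z^{(i)}=j; theta)
    # with c = 1 / P(x^{(i)}; theta), i.e. the posterior derived above.
    return joint / joint.sum(axis=1, keepdims=True)
</syntaxhighlight>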