* Generate latent variables <math>z^{(1)},...,z^{(m)} \in \mathbb{R}^r</math> iid, where the latent dimension <math>r</math> is less than <math>n</math>.
** We assume <math>Z^{(i)} \sim N(\mathbf{0},\mathbf{I})</math>
* Generate <math>x^{(i)}</math> where <math>X^{(i)} \vert Z^{(i)} \sim N(g_{\theta}(z^{(i)}), \sigma^2 \mathbf{I})</math>
** for some function <math>g_{\theta}</math> parameterized by <math>\theta</math>
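The two-step sampling process above can be sketched in a few lines of NumPy. The decoder <math>g_{\theta}</math> here is a stand-in linear map (an assumption for illustration only; in practice it would be a neural network):

```python
import numpy as np

rng = np.random.default_rng(0)
r, n, m = 2, 5, 100          # latent dim r < data dim n, with m samples
sigma = 0.1                  # observation noise scale

# Stand-in parameters theta: a fixed linear decoder (assumption for illustration)
W = rng.normal(size=(n, r))

def g_theta(z):
    # hypothetical decoder g_theta: maps latent z in R^r to a mean in R^n
    return W @ z

# Step 1: draw z^(i) ~ N(0, I) iid
Z = rng.normal(size=(m, r))
# Step 2: draw x^(i) | z^(i) ~ N(g_theta(z^(i)), sigma^2 I)
X = np.array([g_theta(z) + sigma * rng.normal(size=n) for z in Z])
print(X.shape)  # (100, 5)
```

Each row of <code>X</code> is one observed sample <math>x^{(i)}</math>; the corresponding <math>z^{(i)}</math> is never observed, which is what makes it a latent-variable model.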
===Kernel PCA===
{{main | Wikipedia: Kernel principal component analysis}}
===Autoencoder===
You have an encoder and a decoder, which are both neural networks.
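A minimal sketch of the idea, using single linear maps for the encoder and decoder (an assumption for illustration; a real autoencoder uses multi-layer nonlinear networks, and a linear one recovers the same subspace as PCA). Both maps are trained by gradient descent on the mean squared reconstruction error:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, m = 8, 2, 200
# synthetic data lying on an r-dimensional subspace of R^n
X = rng.normal(size=(m, r)) @ rng.normal(size=(r, n))

W_enc = 0.1 * rng.normal(size=(n, r))   # encoder weights: R^n -> R^r
W_dec = 0.1 * rng.normal(size=(r, n))   # decoder weights: R^r -> R^n

def loss(X, W_enc, W_dec):
    # mean squared reconstruction error ||decode(encode(x)) - x||^2
    return np.mean((X @ W_enc @ W_dec - X) ** 2)

loss0 = loss(X, W_enc, W_dec)
lr = 0.05
for _ in range(3000):
    Z = X @ W_enc                       # encode: compress to R^r
    err = Z @ W_dec - X                 # reconstruction error
    # gradient descent on the reconstruction loss, w.r.t. both networks
    W_dec -= lr * Z.T @ err / m
    W_enc -= lr * X.T @ (err @ W_dec.T) / m

print(loss(X, W_enc, W_dec) < loss0)   # reconstruction improves with training
```

The encoder compresses each <math>x \in \mathbb{R}^n</math> to a code <math>z \in \mathbb{R}^r</math> and the decoder tries to reconstruct <math>x</math> from <math>z</math>; forcing the code through the low-dimensional bottleneck is what makes the learned representation useful for dimensionality reduction.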