* KL divergence is always $\geq 0$
* KL divergence is not symmetric: in general $KL(P \Vert Q) \neq KL(Q \Vert P)$
* Jensen-Shannon Divergence
** $JSD(P \Vert Q) = \frac{1}{2}KL(P \Vert M) + \frac{1}{2}KL(Q \Vert M)$ where $M = \frac{1}{2}(P + Q)$
** This is symmetric
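These properties can be checked numerically. Below is a minimal NumPy sketch (the distributions are toy examples, and the helper names <code>kl</code>/<code>jsd</code> are illustrative) using the mixture definition $M = \frac{1}{2}(P + Q)$:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence KL(P || Q) for discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

def jsd(p, q):
    """Jensen-Shannon divergence: average KL of P and Q against the mixture M."""
    m = 0.5 * (np.asarray(p, dtype=float) + np.asarray(q, dtype=float))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Toy distributions over 3 outcomes (no zero entries, so KL is finite).
p = [0.7, 0.2, 0.1]
q = [0.1, 0.3, 0.6]

assert kl(p, q) >= 0                       # KL is non-negative
assert kl(p, q) != kl(q, p)                # KL is not symmetric
assert abs(jsd(p, q) - jsd(q, p)) < 1e-12  # JSD is symmetric
```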
====Model====
The main idea is to ensure that the discriminator is Lipschitz continuous and to limit its Lipschitz constant (i.e. bound the magnitude of its gradient).<br>
If the correct answer is 1.0 and the generator produces 1.0001, we don't want the discriminator to give us a very high loss.<br>
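One common way to enforce this in WGAN-style training is weight clipping. The sketch below (a toy linear "discriminator" $f(x) = Wx$, with made-up sizes and clip value) shows why clipping bounds the Lipschitz constant: for a linear map it equals the spectral norm of $W$, which clipping keeps small.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4)) * 10.0  # weights of a toy linear discriminator f(x) = W x

# WGAN-style weight clipping: force every entry into [-c, c].
c = 0.01
W_clipped = np.clip(W, -c, c)

# For a linear map, the Lipschitz constant (Euclidean norm) is the spectral
# norm of W.  Clipping entries into [-c, c] bounds the Frobenius norm by
# c * sqrt(m * n), which in turn bounds the spectral norm.
lip_after = np.linalg.norm(W_clipped, 2)
assert lip_after <= c * np.sqrt(W.size)
```

Clipping is a blunt instrument (it constrains each weight, not the gradient directly), which is why later work replaced it with gradient penalties, but it makes the Lipschitz bound easy to see.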
====Earth mover's distance====
{{main | wikipedia:earth mover's distance}}
The minimum cost of converting one pile of dirt into another,<br>
where cost is the amount of dirt moved times the distance it is moved.<br>
Given a set $P$ with $m$ clusters and a set $Q$ with $n$ clusters, where $f_{i,j}$ is the flow from cluster $i$ of $P$ to cluster $j$ of $Q$ and $d_{i,j}$ is the distance between them:<br>
...
$EMD(P, Q) = \frac{\sum_{i=1}^{m}\sum_{j=1}^{n}f_{i,j}d_{i,j}}{\sum_{i=1}^{m}\sum_{j=1}^{n}f_{i,j}}$<br>
;Notes
* Also known as the Wasserstein metric
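In one dimension the optimal transport plan simply matches sorted values, which gives a closed form for equal-size samples. A small NumPy sketch (the helper name <code>emd_1d</code> and the sample values are illustrative):

```python
import numpy as np

def emd_1d(x, y):
    """Earth mover's (Wasserstein-1) distance between two equal-size 1-D samples.

    In 1-D the optimal plan matches order statistics, so the cost is the
    average distance between sorted values."""
    x, y = np.sort(np.asarray(x, dtype=float)), np.sort(np.asarray(y, dtype=float))
    assert x.shape == y.shape
    return float(np.mean(np.abs(x - y)))

# Shifting a pile of dirt by a constant costs exactly that constant per unit.
a = np.array([0.0, 1.0, 2.0])
b = a + 3.0
assert emd_1d(a, b) == 3.0
```

For general (multi-dimensional or unequal-weight) inputs the distance requires solving a transportation linear program, matching the $f_{i,j}$ formulation above.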

==Dimension Reduction==
Goal: Reduce the dimension of a dataset.<br>
If each example $x \in \mathbb{R}^n$, we want to reduce each example to be in $\mathbb{R}^r$ where $r < n$.
===PCA===
Principal Component Analysis<br>
Preprocessing: Subtract the sample mean from each example so that the new sample mean is 0.<br>
Goal: Find a unit vector $v_1$ such that the projection $v_1 \cdot x$ has maximum variance.<br>
These principal components are the eigenvectors of $X^T X$; the eigenvector with the largest eigenvalue is the first principal component.<br>
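The steps above (mean-subtract, eigendecompose $X^T X$, project) can be sketched in a few lines of NumPy. The toy data and the choice $r = 1$ are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: 200 points in R^2, stretched along the first axis.
X = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

# Preprocessing: subtract the sample mean so the new sample mean is 0.
X = X - X.mean(axis=0)

# Principal components are the eigenvectors of X^T X.
eigvals, eigvecs = np.linalg.eigh(X.T @ X)
v1 = eigvecs[:, np.argmax(eigvals)]  # direction of maximum variance

# Reduce each example to r = 1 dimension by projecting onto v1.
X_reduced = X @ v1

# The projection onto v1 has larger variance than onto any other eigenvector.
assert X_reduced.var() >= (X @ eigvecs[:, np.argmin(eigvals)]).var()
```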

===Kernel PCA===
===Autoencoder===