# Case 1: <math>H</math> is a discrete variable; then <math>g^*(x, z) = \eta^*(x)^t \psi(z^*)</math>.
# Case 2: There exist <math>\eta,\psi</math> such that <math>E(\eta(x)^t \psi(z) - g^*(x,z))^2 \leq o(\frac{1}{m})</math>.
==Meta Learning==
4: <math>A</math> is a black box (e.g. an LSTM).
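Treating the meta-learner <math>A</math> as a black box means a recurrent network reads a task's support examples and directly emits a prediction for a query point, with no inner gradient step. A minimal sketch of this data flow, with hypothetical names and shapes and untrained random weights (only the structure is illustrated, not a working learner):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: each recurrent input packs x (2-dim) plus y (scalar).
d_in, d_hid = 3, 8
W_xh = rng.normal(size=(d_in, d_hid)) * 0.1   # input-to-hidden weights
W_hh = rng.normal(size=(d_hid, d_hid)) * 0.1  # hidden-to-hidden weights
W_hy = rng.normal(size=(d_hid, 1)) * 0.1      # hidden-to-output weights

def rnn_step(h, v):
    """One recurrent update: h' = tanh(v W_xh + h W_hh)."""
    return np.tanh(v @ W_xh + h @ W_hh)

def meta_predict(support, x_query):
    """Read the support set sequentially, then predict y for x_query.

    support: list of (x, y) pairs; the hidden state accumulates the
    task information. The query is fed with a zero placeholder for y.
    """
    h = np.zeros(d_hid)
    for x, y in support:
        h = rnn_step(h, np.concatenate([x, [y]]))
    h = rnn_step(h, np.concatenate([x_query, [0.0]]))
    return float(h @ W_hy)

# Toy task: the support pairs condition the hidden state, and the
# query prediction is a single scalar read out from that state.
support = [(np.array([1.0, 2.0]), 3.0), (np.array([0.5, 0.5]), 1.0)]
y_hat = meta_predict(support, np.array([2.0, 1.0]))
```

A practical version would replace the plain tanh recurrence with an LSTM and train the weights across many tasks, but the interface is the same: support set in, query prediction out.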
==LSTMs and Transformers==
Lecture 23 (November 19, 2020)
===Recurrent Neural Networks (RNNs)===
==Misc==