Revision as of 00:21, 24 June 2021
Machine Learning, Computer Vision, and Computer Graphics Glossary
C
- Convolutional neural network or CNN - A neural network architecture for image data, or other data on a regular grid.
D
- Dilation - How spread out a CNN kernel is, i.e. the spacing between kernel elements. See Convolutional neural network.
- Domain Adaptation - An area of research focused on making neural networks work with alternate domains, or sources of data.
F
- Fully connected network - The standard neural network model, in which every node in one layer is connected to every node in the next layer.
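A minimal sketch of a fully connected forward pass (the function name, ReLU activation, and weight layout are illustrative choices, not part of the glossary entry):

```python
import numpy as np

def mlp_forward(x, weights, biases):
    """Forward pass of a fully connected network.

    Each hidden layer multiplies by a dense weight matrix, adds a bias,
    and applies a ReLU nonlinearity; the final layer is left linear.
    """
    for W, b in zip(weights[:-1], biases[:-1]):
        x = np.maximum(0.0, x @ W + b)  # dense layer + ReLU
    return x @ weights[-1] + biases[-1]  # linear output layer
```

Every input element influences every hidden node, which is what distinguishes this model from convolutional or graph architectures that restrict connectivity.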
G
- Generative adversarial network or GAN - A neural network setup for generating examples from a training distribution.
- Graph Neural Network - A neural network architecture for graph-structured data.
I
- Intersection over Union - A metric for computing the accuracy of bounding box prediction.
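A minimal sketch of how Intersection over Union can be computed for two axis-aligned boxes; the (x1, y1, x2, y2) corner format and function name are illustrative assumptions:

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    # Corners of the intersection rectangle.
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    # Clamp to zero so disjoint boxes get zero intersection.
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

The result is 1.0 for identical boxes, 0.0 for disjoint ones, and in between otherwise, which makes it a convenient thresholdable accuracy score for bounding box predictions.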
L
- Long short-term memory or LSTM - An RNN architecture that maintains two hidden states: a cell state for long-term information and a hidden state for short-term information.
M
- Multilayer perceptron - See Fully connected network.
N
- Normalized Device Coordinates - A convention in which image pixel coordinates are normalized to the range \([-1, 1] \times [-1, 1]\).
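A minimal sketch of the pixel-to-NDC mapping. Conventions vary between libraries (pixel centers vs. corners, y-axis direction); this sketch assumes pixel centers map into [-1, 1] with no y flip:

```python
def pixel_to_ndc(px, py, width, height):
    """Map integer pixel coordinates (pixel centers) to NDC in [-1, 1].

    The +0.5 offset places each sample at the center of its pixel,
    so pixel 0 of a 2-wide image maps to -0.5 rather than -1.0.
    """
    x = 2.0 * (px + 0.5) / width - 1.0
    y = 2.0 * (py + 0.5) / height - 1.0
    return x, y
```

Working in NDC makes downstream math independent of image resolution, which is why graphics pipelines and grid-sampling operators use it.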
R
- Recurrent neural network (RNN) - A type of neural network that processes sequence data one element at a time, carrying a hidden state between steps.
S
- Stride - How far the CNN kernel moves, in input pixels, between consecutive output pixels.
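Stride and dilation both enter the standard convolution output-size arithmetic. A minimal sketch of that formula for one spatial axis (the function name and keyword defaults are illustrative):

```python
import math

def conv_output_size(n, kernel, stride=1, padding=0, dilation=1):
    """Spatial output size of a convolution along one axis.

    Dilation stretches the kernel's reach over the input:
    its effective extent is dilation * (kernel - 1) + 1.
    Stride then controls how many outputs fit in that span.
    """
    effective = dilation * (kernel - 1) + 1
    return math.floor((n + 2 * padding - effective) / stride) + 1
```

For example, a 3-wide kernel over 32 inputs yields 30 outputs; stride 2 roughly halves that, and dilation 2 widens the kernel's effective extent to 5, shrinking the output further.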
T
- Transfer Learning - Techniques for adapting a neural network to perform a different task than the one it was originally trained on.
- Transformer (machine learning model) - A neural network architecture for sequence data.