==D==
* Dilation - How spread out a CNN kernel is. See [[Convolutional neural network]].
* Domain Adaptation - An area of research focused on making neural networks work with alternate domains, or sources of data.
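The effect of dilation above can be sketched as a 1D convolution whose kernel taps are spaced <code>dilation</code> inputs apart (a minimal NumPy illustration; <code>dilated_conv1d</code> is a hypothetical helper written for this page, not a library function):

```python
import numpy as np

def dilated_conv1d(x, kernel, dilation=1):
    """'Valid' 1D convolution whose kernel taps are `dilation` inputs apart."""
    k = len(kernel)
    span = (k - 1) * dilation + 1              # effective receptive field
    out_len = len(x) - span + 1
    return np.array([
        sum(kernel[j] * x[i + j * dilation] for j in range(k))
        for i in range(out_len)
    ])

x = np.arange(8, dtype=float)                  # [0, 1, ..., 7]
kernel = np.array([1.0, 1.0, 1.0])

# dilation=1 is an ordinary convolution; dilation=2 skips every other input,
# widening the receptive field from 3 to 5 without adding kernel weights.
print(dilated_conv1d(x, kernel, dilation=1))   # [ 3.  6.  9. 12. 15. 18.]
print(dilated_conv1d(x, kernel, dilation=2))   # [ 6.  9. 12. 15.]
```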
==F==
==G==
* [[Generative adversarial network]] or GAN - A neural network setup for generating examples from a training distribution.
* Graph Neural Network
==I==
* Intersection over Union - A metric for computing the accuracy of bounding box predictions.
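Intersection over Union can be computed in a few lines; the sketch below assumes axis-aligned boxes given as <code>(x1, y1, x2, y2)</code> corners (the function name is mine, not from any particular library):

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    # Overlap rectangle: max of the top-left corners, min of the bottom-right.
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)   # union = sum of areas - overlap

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))        # overlap 1, union 7 → 1/7 ≈ 0.143
```

IoU is 1 for identical boxes and 0 for disjoint ones, which is why detection benchmarks threshold it (e.g. "correct if IoU > 0.5").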
==L==
* [[Long short-term memory]] or LSTM - An RNN architecture which keeps two hidden states: the cell state for long-term memory and the hidden state for short-term memory.
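The two-state structure can be seen in a single LSTM step, sketched below in NumPy (a minimal illustration with one stacked weight matrix for the four gates; <code>lstm_step</code> and the weight layout are assumptions of this sketch, not a library API):

```python
import numpy as np

def lstm_step(x, h, c, W, b):
    """One LSTM step. W: (4H, D+H), b: (4H,). Returns new (h, c).
    c is the long-term cell state; h is the short-term hidden state."""
    z = W @ np.concatenate([x, h]) + b
    i, f, o, g = np.split(z, 4)                # input, forget, output, candidate
    sig = lambda a: 1.0 / (1.0 + np.exp(-a))
    c_new = sig(f) * c + sig(i) * np.tanh(g)   # gated update of long-term memory
    h_new = sig(o) * np.tanh(c_new)            # short-term output read from it
    return h_new, c_new

rng = np.random.default_rng(0)
D, H = 3, 2                                    # toy input / hidden sizes
W = rng.normal(size=(4 * H, D + H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(5):                             # run over a length-5 sequence
    h, c = lstm_step(rng.normal(size=D), h, c, W, b)
print(h.shape, c.shape)
```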
==M==
==N==
* Normalized Device Coordinates - A convention in which image pixel coordinates are normalized to <math>[-1, 1]\times[-1, 1]</math>.
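One common pixel-to-NDC mapping (conventions vary between libraries, e.g. in y-axis direction and whether pixel centers are used; the function below is one assumed convention, not a standard API):

```python
def pixel_to_ndc(px, py, width, height):
    """Map integer pixel indices to NDC in [-1, 1], sampling pixel centers."""
    x = 2.0 * (px + 0.5) / width - 1.0   # +0.5 targets the pixel's center
    y = 2.0 * (py + 0.5) / height - 1.0
    return x, y

# In a 4x4 image, the corner pixels land at +/-0.75, not +/-1.0,
# because we sample pixel centers rather than pixel edges.
print(pixel_to_ndc(0, 0, 4, 4))   # (-0.75, -0.75)
print(pixel_to_ndc(3, 3, 4, 4))   # (0.75, 0.75)
```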
==R==
* Recurrent neural network (RNN) - A type of neural network which operates sequentially over sequence data, carrying a hidden state from one step to the next.
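The sequential structure is the whole definition: each step's hidden state depends on the previous one, so the sequence cannot be processed in parallel over time. A vanilla-RNN forward pass in NumPy (names and shapes are assumptions of this sketch):

```python
import numpy as np

def rnn_forward(xs, W_xh, W_hh, b):
    """Vanilla RNN over a sequence xs of shape (T, D); returns all hidden states."""
    h = np.zeros(W_hh.shape[0])
    hs = []
    for x in xs:                               # sequential: h_t depends on h_{t-1}
        h = np.tanh(W_xh @ x + W_hh @ h + b)
        hs.append(h)
    return np.stack(hs)                        # shape (T, H)

rng = np.random.default_rng(1)
xs = rng.normal(size=(5, 3))                   # length-5 sequence of 3-dim inputs
W_xh = rng.normal(size=(2, 3))                 # input-to-hidden weights
W_hh = rng.normal(size=(2, 2))                 # hidden-to-hidden (recurrent) weights
b = np.zeros(2)
hs = rnn_forward(xs, W_xh, W_hh, b)
print(hs.shape)                                # (5, 2)
```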
==S==
==T==
* [[Transfer Learning]] - Techniques for making a neural network perform a task different from the one it was originally trained on.
* [[Transformer (machine learning model)]] - A neural network architecture for sequence data, built around the attention mechanism.
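At the core of the Transformer is scaled dot-product attention, which can be sketched in a few lines of NumPy (single head, no masking or projections; a minimal illustration, not a full implementation):

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)              # (n_queries, n_keys) similarities
    # Row-wise softmax, shifted by the max for numerical stability.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                         # each output is a weighted mix of V

# With an all-zero query, every key scores equally, so the output
# is the uniform average of the value rows.
out = attention(np.zeros((1, 2)), np.eye(2), np.array([[1.0, 0.0], [0.0, 1.0]]))
print(out)                                     # [[0.5 0.5]]
```

Unlike the RNN above, attention looks at the whole sequence at once rather than stepping through it, which is what makes Transformers parallelizable over sequence length.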