Machine Learning Glossary

Machine Learning, Computer Vision, and Computer Graphics Glossary

==C==
* [[Capsule neural network]]
* [[Convolutional neural network]] or CNN - A neural network architecture for image data, or other data on a regular grid (see the convolution sketch under Dilation).

==D==
* Dilation - the spacing between the elements of a CNN kernel; see the sketch below and [[Convolutional neural network]].
* Domain Adaptation - An area of research focused on making neural networks work with alternate domains, or sources of data.
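A minimal NumPy sketch of 1-D convolution illustrating both dilation and stride (the function <code>conv1d</code> and all values here are illustrative, not from any particular library):
<syntaxhighlight lang="python">
import numpy as np

def conv1d(x, kernel, stride=1, dilation=1):
    """1-D cross-correlation with no padding, as used in CNNs."""
    k = len(kernel)
    span = dilation * (k - 1) + 1                 # input span covered by the dilated kernel
    n_out = (len(x) - span) // stride + 1
    out = np.empty(n_out)
    for i in range(n_out):
        start = i * stride                        # stride: input pixels between output pixels
        taps = x[start : start + span : dilation] # dilation: spacing between kernel elements
        out[i] = np.dot(taps, kernel)
    return out

x = np.arange(10, dtype=float)
k = np.array([1.0, 0.0, -1.0])
print(conv1d(x, k))                # dense: 8 outputs
print(conv1d(x, k, stride=2))      # stride 2: 4 outputs, kernel jumps 2 pixels at a time
print(conv1d(x, k, dilation=2))    # dilation 2: 6 outputs, kernel spans 5 input pixels
</syntaxhighlight>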

==E==
* Early stopping - a technique where you stop training once the validation loss begins increasing; see the sketch below. This is not used very often anymore with large models.
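A minimal sketch of patience-based early stopping. The training, snapshot, and restore steps are hypothetical placeholders (left as comments) standing in for real training code; the validation curve is simulated so the sketch runs on its own:
<syntaxhighlight lang="python">
import random

def fit(max_epochs=100, patience=5):
    """Patience-based early stopping around a hypothetical training loop."""
    best_loss, best_epoch = float("inf"), -1
    bad_epochs = 0
    for epoch in range(max_epochs):
        # train_one_epoch(model)  # hypothetical: one pass over the training set
        val_loss = simulated_validation_loss(epoch)
        if val_loss < best_loss:
            best_loss, best_epoch = val_loss, epoch
            # snapshot(model)     # hypothetical: save the best weights so far
            bad_epochs = 0
        else:
            bad_epochs += 1
            if bad_epochs >= patience:
                break             # validation loss has stopped improving
    # restore(model, best_epoch)  # hypothetical: roll back to the best epoch
    return best_epoch, best_loss

# Stand-in validation curve: improves until roughly epoch 30, then worsens.
def simulated_validation_loss(epoch):
    return (epoch - 30) ** 2 / 900.0 + 0.1 * random.Random(epoch).random()

print(fit())  # stops a few epochs after the minimum near epoch 30
</syntaxhighlight>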

==F==
* Fully connected network - The standard neural network model, in which every node in one layer is connected to every node in the previous layer; see the sketch below.
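A minimal NumPy sketch of a forward pass through a fully connected network (the layer sizes are arbitrary):
<syntaxhighlight lang="python">
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def mlp_forward(x, layers):
    """Forward pass through a fully connected network.
    `layers` is a list of (W, b) pairs; each layer computes relu(x @ W + b),
    except the last, which is left linear."""
    for W, b in layers[:-1]:
        x = relu(x @ W + b)
    W, b = layers[-1]
    return x @ W + b

rng = np.random.default_rng(0)
dims = [4, 16, 16, 3]    # input dim 4, two hidden layers of 16, output dim 3
layers = [(rng.normal(size=(m, n)) * np.sqrt(2.0 / m), np.zeros(n))
          for m, n in zip(dims[:-1], dims[1:])]
print(mlp_forward(rng.normal(size=(2, 4)), layers).shape)  # (2, 3)
</syntaxhighlight>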

==G==

==I==
* Intersection over Union - A metric for computing the accuracy of bounding box predictions: the area of the boxes' intersection divided by the area of their union; see the sketch below.
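A minimal sketch for axis-aligned boxes given as <code>(x1, y1, x2, y2)</code> corners (this corner convention is an assumption; other box encodings exist):
<syntaxhighlight lang="python">
def iou(box_a, box_b):
    """Intersection over Union for axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)  # 0 if boxes don't overlap
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1 / 7, about 0.143
</syntaxhighlight>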

==L==
* Long short-term memory or LSTM - An RNN architecture which keeps two hidden states: a cell state for long-term memory and a hidden state for short-term memory; see the sketch below.
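A minimal NumPy sketch of a single LSTM step, showing how the long-term cell state <code>c</code> and short-term hidden state <code>h</code> are updated (weight shapes and values are illustrative):
<syntaxhighlight lang="python">
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, params):
    """One LSTM step; `params` holds one (W, U, b) triple per gate."""
    (Wf, Uf, bf), (Wi, Ui, bi), (Wo, Uo, bo), (Wg, Ug, bg) = params
    f = sigmoid(x @ Wf + h @ Uf + bf)   # forget gate: what to erase from c
    i = sigmoid(x @ Wi + h @ Ui + bi)   # input gate: what to write to c
    o = sigmoid(x @ Wo + h @ Uo + bo)   # output gate: what to expose as h
    g = np.tanh(x @ Wg + h @ Ug + bg)   # candidate cell contents
    c = f * c + i * g                   # update long-term cell state
    h = o * np.tanh(c)                  # update short-term hidden state
    return h, c

rng = np.random.default_rng(0)
d_in, d_h = 3, 5
params = [(rng.normal(size=(d_in, d_h)), rng.normal(size=(d_h, d_h)), np.zeros(d_h))
          for _ in range(4)]
h = c = np.zeros(d_h)
for x in rng.normal(size=(7, d_in)):    # run over a length-7 sequence
    h, c = lstm_step(x, h, c, params)
print(h.shape, c.shape)                 # (5,) (5,)
</syntaxhighlight>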

==M==
* Multilayer perceptron - See Fully connected network.

==N==
* Normalized Device Coordinates - In images, pixel positions are expressed in coordinates of <math>[-1, 1]\times[-1, 1]</math>; see the sketch below.
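A minimal sketch of one common pixel-to-NDC convention (pixel centers at half-integer coordinates; other conventions exist, so treat this as an assumption):
<syntaxhighlight lang="python">
def pixel_to_ndc(x, y, width, height):
    """Map pixel indices to normalized device coordinates in [-1, 1],
    assuming pixel centers sit at half-integer coordinates."""
    ndc_x = 2.0 * (x + 0.5) / width - 1.0
    ndc_y = 2.0 * (y + 0.5) / height - 1.0
    return ndc_x, ndc_y

print(pixel_to_ndc(0, 0, 4, 4))  # (-0.75, -0.75): center of the corner pixel
print(pixel_to_ndc(3, 3, 4, 4))  # (0.75, 0.75)
</syntaxhighlight>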

==O==
* Overfitting - when a model begins to learn noise specific to your training data, thereby worsening performance on non-training data; see the sketch below.
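A minimal NumPy sketch: fitting polynomials of increasing degree to noisy samples of a known function. The function and noise level are arbitrary; the expected pattern is that the degree-15 fit drives training error toward zero while its test error is worse than the degree-3 fit's:
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: np.sin(3 * x)                   # true underlying function
x_train = np.linspace(-1, 1, 30)
x_test = np.linspace(-1, 1, 200)
y_train = f(x_train) + 0.2 * rng.normal(size=x_train.shape)  # noisy observations

for degree in (0, 3, 15):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - f(x_test)) ** 2)
    print(f"degree {degree:2d}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
# degree 0 underfits (both errors high); degree 15 overfits
# (training error near zero, test error worse than degree 3).
</syntaxhighlight>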

==R==
* Recurrent neural network (RNN) - A type of neural network which operates sequentially on sequence data, carrying a hidden state from one element to the next (the LSTM sketch above shows one such recurrence).
* [[Reinforcement learning]] - an area of machine learning focused on learning to perform actions, e.g. playing a game; see the sketch below.
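A minimal sketch of tabular Q-learning on a toy 5-state chain environment (the environment and hyperparameters are invented for illustration):
<syntaxhighlight lang="python">
import numpy as np

# Toy chain: actions move left or right; reaching the rightmost state ends
# the episode with reward 1.
n_states, n_actions = 5, 2              # actions: 0 = left, 1 = right
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.1, 0.9, 0.1   # learning rate, discount, exploration
rng = np.random.default_rng(0)

for episode in range(500):
    s = 0
    while s != n_states - 1:            # episode ends at the rightmost state
        if rng.random() < epsilon:      # epsilon-greedy with random tie-breaking
            a = int(rng.integers(n_actions))
        else:
            a = int(rng.choice(np.flatnonzero(Q[s] == Q[s].max())))
        s_next = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
        r = 1.0 if s_next == n_states - 1 else 0.0
        # Q-learning update: move Q(s, a) toward r + gamma * max_a' Q(s_next, a')
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        s = s_next

print(np.argmax(Q[:-1], axis=1))        # learned policy: [1 1 1 1], always go right
</syntaxhighlight>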

==S==
* Stride - how far the CNN kernel moves, in input pixels, between consecutive output pixels (see the convolution sketch under Dilation).

==T==
* [[Transfer Learning]] - Techniques to make a neural network perform a task other than the one it was trained on; see the first sketch below.
* [[Transformer (machine learning model)]] - A neural network architecture for sequence data which uses attention between elements of the sequence; see the second sketch below.
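A common transfer-learning recipe, sketched with PyTorch and torchvision (assumes torchvision 0.13 or newer for the string <code>weights</code> argument; the 10-class head is an arbitrary example):
<syntaxhighlight lang="python">
import torch.nn as nn
from torchvision.models import resnet18

# Load an ImageNet-pretrained backbone, freeze it, and replace the final
# classification layer for a new 10-class task (assumed for illustration).
model = resnet18(weights="IMAGENET1K_V1")
for param in model.parameters():
    param.requires_grad = False                     # freeze pretrained weights
model.fc = nn.Linear(model.fc.in_features, 10)      # new head, trained from scratch
</syntaxhighlight>
And a minimal NumPy sketch of the scaled dot-product attention at the core of the Transformer (single head, arbitrary dimensions):
<syntaxhighlight lang="python">
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)     # for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: each output is a weighted average of
    the values V, with weights from query-key similarity."""
    d_k = Q.shape[-1]
    weights = softmax(Q @ K.T / np.sqrt(d_k))   # (n_queries, n_keys)
    return weights @ V

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 8))                     # a length-6 sequence of 8-dim tokens
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = attention(X @ Wq, X @ Wk, X @ Wv)         # self-attention over the sequence
print(out.shape)                                # (6, 8)
</syntaxhighlight>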

==U==
* Underfitting - when a model performs poorly on both training and validation data, usually due to inadequate model complexity or training duration (contrast with Overfitting and its polynomial sketch).