Machine Learning Glossary

Machine Learning, Computer Vision, and Computer Graphics Glossary

C

D

  • Dilation - the spacing between elements of a CNN kernel, i.e. how spread out the kernel is over the input. See Convolutional neural network and the sketch after this list.
  • Domain Adaptation - An area of research focused on making neural networks work on domains, or sources of data, other than those they were trained on.
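
A minimal sketch of how dilation changes a convolution's receptive field, assuming PyTorch; the layer parameters and input shape are illustrative only:

```python
import torch
import torch.nn as nn

# A 3x3 kernel with dilation=2 covers a 5x5 input window:
# effective kernel size = dilation * (kernel_size - 1) + 1.
conv_dense = nn.Conv2d(1, 1, kernel_size=3, dilation=1)
conv_dilated = nn.Conv2d(1, 1, kernel_size=3, dilation=2)

x = torch.randn(1, 1, 32, 32)  # (batch, channels, height, width)
print(conv_dense(x).shape)     # torch.Size([1, 1, 30, 30])
print(conv_dilated(x).shape)   # torch.Size([1, 1, 28, 28])
```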

E

  • Early stopping - a technique where you stop training once the validation loss begins increasing, to limit overfitting. It is not used very often anymore with large models. See the sketch below.
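
A minimal early-stopping loop in plain Python; the training and validation functions are hypothetical stand-ins, simulated here with a toy loss curve:

```python
import random

def train_one_epoch():
    pass  # stand-in for a real pass over the training set

def validation_loss(epoch):
    # Toy loss curve: falls until epoch 30, then creeps back up
    # (overfitting), plus a little noise.
    return (epoch - 30) ** 2 / 1000 + random.uniform(0, 0.01)

best_loss = float("inf")
patience = 5      # epochs to wait after the last improvement
stale_epochs = 0

for epoch in range(100):
    train_one_epoch()
    loss = validation_loss(epoch)
    if loss < best_loss:
        best_loss, stale_epochs = loss, 0
    else:
        stale_epochs += 1
    if stale_epochs >= patience:
        print(f"early stop at epoch {epoch}, best loss {best_loss:.4f}")
        break
```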

F

  • Fully connected network - The standard neural network model where every node in one layer is connected to every node in the next layer. See the sketch below.
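
A minimal fully connected network sketch, assuming PyTorch; the layer sizes are arbitrary:

```python
import torch
import torch.nn as nn

# Two fully connected (linear) layers with a ReLU in between.
mlp = nn.Sequential(
    nn.Linear(784, 128),  # every input connects to every hidden node
    nn.ReLU(),
    nn.Linear(128, 10),
)

x = torch.randn(32, 784)  # batch of 32 flattened 28x28 images
print(mlp(x).shape)       # torch.Size([32, 10])
```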

G

  • Generative adversarial network (GAN) - A neural network setup for generating examples from a training distribution.
  • Graph neural network (GNN) - A type of neural network which operates on graph inputs.

I

  • Intersection over Union (IoU) - A metric for evaluating bounding box predictions: the area of the intersection of the predicted and ground-truth boxes divided by the area of their union. See the sketch below.
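
A minimal IoU computation for axis-aligned boxes in plain Python; the (x1, y1, x2, y2) corner convention is an assumption:

```python
def iou(box_a, box_b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    # Overlap rectangle (zero area if the boxes do not intersect).
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    intersection = max(0, x2 - x1) * max(0, y2 - y1)

    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return intersection / (area_a + area_b - intersection)

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1 / 7 ≈ 0.143
```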

L

  • Long short-term memory (LSTM) - A recurrent neural network architecture which maintains two hidden states, a cell state for long-term memory and a hidden state for short-term memory.
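
A minimal sketch showing the two LSTM states, assuming PyTorch; the sizes are illustrative:

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)

x = torch.randn(4, 20, 8)     # (batch, sequence length, features)
output, (h_n, c_n) = lstm(x)  # h_n: short-term, c_n: long-term state
print(output.shape)           # torch.Size([4, 20, 16])
print(h_n.shape, c_n.shape)   # torch.Size([1, 4, 16]) each
```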

M

  • Multilayer perceptron (MLP) - See Fully connected network.

N

  • Normalized Device Coordinates - A convention where image pixels are mapped to coordinates in \(\displaystyle [-1, 1]\times[-1, 1] \), independent of the image resolution. See the sketch below.
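
A minimal pixel-to-NDC conversion in plain Python, assuming pixel centers sit at half-integer coordinates; other conventions differ by half a pixel:

```python
def pixel_to_ndc(row, col, width, height):
    """Map the center of pixel (row, col) to [-1, 1] x [-1, 1]."""
    x = 2 * (col + 0.5) / width - 1   # column -> horizontal coordinate
    y = 2 * (row + 0.5) / height - 1  # row -> vertical coordinate
    return x, y

print(pixel_to_ndc(0, 0, 4, 4))  # (-0.75, -0.75)
print(pixel_to_ndc(3, 3, 4, 4))  # (0.75, 0.75)
```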

O

  • Overfitting - when a model begins to learn noise specific to your training data, thereby worsening performance on non-training data.

R

  • Recurrent neural network (RNN) - A type of neural network which processes sequence data one element at a time, carrying a hidden state between steps.
  • Reinforcement Learning - an area of machine learning focused on learning to perform actions, e.g. playing a game.

S

  • Stride - how far the CNN kernel moves across the input, in input pixels, between consecutive output pixels. See the sketch below.
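
A minimal sketch of how stride affects output resolution, assuming PyTorch:

```python
import torch
import torch.nn as nn

# stride=2 halves the spatial resolution (with padding=1 here):
# output size = floor((input + 2*padding - kernel) / stride) + 1.
conv = nn.Conv2d(1, 1, kernel_size=3, stride=2, padding=1)

x = torch.randn(1, 1, 32, 32)
print(conv(x).shape)  # torch.Size([1, 1, 16, 16])
```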

T

U

  • Underfitting - when a model performs poorly on both training and validation data, usually due to inadequate model complexity or training duration.