Activation Functions

List of common activation functions


==Sinusoidal==

===Tanh===

One of the older activation functions. It squashes inputs to the range <math>(-1, 1)</math>: <math>\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}</math>.

===Sigmoid===
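The logistic sigmoid squashes inputs to <math>(0, 1)</math>: <math>\sigma(x) = \frac{1}{1 + e^{-x}}</math>.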

===Cosine/Sine===

See [https://proceedings.neurips.cc/paper/2020/hash/53c04118df112c13a8c34b38343b9c10-Abstract.html SIREN], which uses sine activations for implicit neural representations.
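Below is a minimal PyTorch sketch of a SIREN-style sine layer. The <math>\omega_0 = 30</math> frequency scale and the uniform initialization bounds follow the paper's recommendations, but treat them as assumptions of this sketch and check the official code for the exact scheme.

<syntaxhighlight lang="python">
import torch
import torch.nn as nn

class SineLayer(nn.Module):
    """Linear layer followed by a scaled sine activation (SIREN-style)."""

    def __init__(self, in_features, out_features, omega_0=30.0, is_first=False):
        super().__init__()
        self.omega_0 = omega_0
        self.linear = nn.Linear(in_features, out_features)
        with torch.no_grad():
            if is_first:
                # First layer: uniform in [-1/n, 1/n].
                bound = 1.0 / in_features
            else:
                # Hidden layers: uniform in [-sqrt(6/n)/omega_0, sqrt(6/n)/omega_0].
                bound = (6.0 / in_features) ** 0.5 / omega_0
            self.linear.weight.uniform_(-bound, bound)

    def forward(self, x):
        return torch.sin(self.omega_0 * self.linear(x))
</syntaxhighlight>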

==ReLU==

https://pytorch.org/docs/stable/generated/torch.nn.ReLU.html

ReLU is one of the most popular activation functions. It simply computes <math>\operatorname{ReLU}(x) = \max(x, 0) = (x > 0) \cdot x</math>.
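A quick PyTorch check that the formulations above agree:

<syntaxhighlight lang="python">
import torch
import torch.nn as nn

x = torch.tensor([-2.0, -0.5, 0.0, 1.5])

relu = nn.ReLU()
print(relu(x))                # negatives zeroed: 0, 0, 0, 1.5
print(torch.clamp(x, min=0))  # same result via clamping
print((x > 0) * x)            # same result via the mask formulation above
</syntaxhighlight>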

===LeakyReLU===

https://pytorch.org/docs/stable/generated/torch.nn.LeakyReLU.html

<math>\operatorname{LeakyReLU}(x) = (x > 0) \cdot x + (x < 0) \cdot 0.01x</math>, where 0.01 is PyTorch's default negative slope (configurable via <code>negative_slope</code>).
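A minimal PyTorch sketch; <code>negative_slope</code> defaults to 0.01:

<syntaxhighlight lang="python">
import torch
import torch.nn as nn

x = torch.tensor([-2.0, -0.5, 0.0, 1.5])

leaky = nn.LeakyReLU(negative_slope=0.01)
print(leaky(x))                           # negative inputs are scaled by 0.01 instead of zeroed
print((x > 0) * x + (x < 0) * 0.01 * x)   # matches the closed form above
</syntaxhighlight>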

===SiLU===
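https://pytorch.org/docs/stable/generated/torch.nn.SiLU.html

SiLU (Sigmoid Linear Unit, also known as swish) weights the input by its own sigmoid: <math>\operatorname{SiLU}(x) = x \cdot \sigma(x)</math>.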

===GELU===

https://pytorch.org/docs/stable/generated/torch.nn.GELU.html
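GELU weights the input by the standard normal CDF <math>\Phi</math>: <math>\operatorname{GELU}(x) = x \cdot \Phi(x) = \frac{x}{2}\left(1 + \operatorname{erf}\left(x/\sqrt{2}\right)\right)</math>. A minimal check of this formula against the PyTorch module:

<syntaxhighlight lang="python">
import math
import torch
import torch.nn as nn

x = torch.tensor([-2.0, -0.5, 0.0, 1.5])

gelu = nn.GELU()  # exact (erf-based) form by default
manual = x * 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0)))
print(torch.allclose(gelu(x), manual))  # True
</syntaxhighlight>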

===GeGLU===
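GeGLU is the GELU-gated variant of the gated linear unit: one linear projection is passed through GELU and multiplies a second linear projection elementwise, <math>\operatorname{GeGLU}(x) = \operatorname{GELU}(xW + b) \otimes (xV + c)</math>. A minimal sketch; producing both halves with a single projection and <code>chunk</code> is a common implementation convention, not something prescribed by this page:

<syntaxhighlight lang="python">
import torch
import torch.nn as nn
import torch.nn.functional as F

class GeGLU(nn.Module):
    """Gated linear unit with a GELU gate: value * GELU(gate)."""

    def __init__(self, dim_in, dim_out):
        super().__init__()
        # Single projection producing both the value and the gate halves.
        self.proj = nn.Linear(dim_in, 2 * dim_out)

    def forward(self, x):
        value, gate = self.proj(x).chunk(2, dim=-1)
        return value * F.gelu(gate)

layer = GeGLU(512, 2048)          # hypothetical sizes for illustration
out = layer(torch.randn(4, 512))  # -> shape (4, 2048)
</syntaxhighlight>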

==Resources==
* [https://armandolivares.tech/2022/09/04/elu-gelu-and-silu-activation-functions/]