Activation Functions
List of common activation functions.

==Sinusoidal==
===Tanh===
One of the older activation functions.
===Sigmoid===
===Cosine/Sine===
See [https://proceedings.neurips.cc/paper/2020/hash/53c04118df112c13a8c34b38343b9c10-Abstract.html SIREN].

==ReLU==
https://pytorch.org/docs/stable/generated/torch.nn.ReLU.html

ReLU is one of the most popular activation functions. It simply computes <math>\operatorname{ReLU}(x) = \max(x, 0)</math>.

===LeakyReLU===
https://pytorch.org/docs...
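The ReLU and LeakyReLU definitions above can be sketched in plain Python. This is an illustrative sketch, not the PyTorch implementation; the 0.01 negative slope is an assumption chosen to mirror the default in torch.nn.LeakyReLU:

```python
def relu(x):
    # ReLU(x) = max(x, 0): pass positives through, zero out negatives.
    return max(x, 0.0)

def leaky_relu(x, negative_slope=0.01):
    # Like ReLU, but scales negative inputs by a small slope instead of
    # zeroing them, so gradients still flow for x < 0.
    # negative_slope=0.01 assumed to match torch.nn.LeakyReLU's default.
    return x if x > 0 else negative_slope * x
```

For example, `relu(-2.0)` returns `0.0`, while `leaky_relu(-2.0)` returns `-0.02`.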
===SiLU===
===GELU===
https://pytorch.org/docs/stable/generated/torch.nn.GELU.html
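A small sketch of SiLU and GELU in plain Python, assuming the standard definitions SiLU(x) = x·sigmoid(x) and GELU(x) = x·Φ(x) with Φ the standard normal CDF (the exact-erf form; torch.nn.GELU also offers a tanh approximation). Function names here are illustrative:

```python
import math

def silu(x):
    # SiLU(x) = x * sigmoid(x), also known as Swish.
    return x / (1.0 + math.exp(-x))

def gelu(x):
    # GELU(x) = x * Phi(x), Phi being the standard normal CDF,
    # written via the error function: Phi(x) = 0.5 * (1 + erf(x / sqrt(2))).
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))
```

Both behave like smooth variants of ReLU: near zero they suppress the input softly, and for large positive x they approach the identity.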
===GeGLU===

==Resources==
* [https://armandolivares.tech/2022/09/04/elu-gelu-and-silu-activation-functions/]