Activation Functions

From David's Wiki
Revision as of 15:06, 15 March 2024 by David (talk | contribs)

List of common activation functions


Sigmoidal and Sinusoidal

Tanh

One of the older activation functions. It squashes its input into \((-1, 1)\): \(\displaystyle \tanh(x) = \frac{e^x - e^{-x}}{e^x + e^{-x}}\)

Sigmoid

The logistic sigmoid squashes its input into \((0, 1)\): \(\displaystyle \sigma(x) = \frac{1}{1 + e^{-x}}\)
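Both functions above can be sketched in a few lines of plain Python (a minimal scalar version, not the PyTorch modules):

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid: squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x: float) -> float:
    """Hyperbolic tangent: squashes input into (-1, 1).

    Equivalent to a shifted, rescaled sigmoid: tanh(x) = 2*sigmoid(2x) - 1.
    """
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))
```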

Cosine/Sine

See SIREN (https://proceedings.neurips.cc/paper/2020/hash/53c04118df112c13a8c34b38343b9c10-Abstract.html), which uses periodic \(\sin\) activations to represent signals such as images and signed distance fields.
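As a sketch, SIREN applies a sine to a scaled pre-activation; the SIREN paper suggests a frequency scale \(\omega_0 = 30\) for the first layer:

```python
import math

def siren_activation(x: float, omega_0: float = 30.0) -> float:
    # Sine activation as in SIREN; omega_0 scales the input frequency
    # (omega_0 = 30 is the paper's suggestion for the first layer).
    return math.sin(omega_0 * x)
```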

ReLU

https://pytorch.org/docs/stable/generated/torch.nn.ReLU.html

ReLU is one of the most popular activation functions. It simply computes \(\displaystyle \operatorname{ReLU}(x) = \max(x, 0) = (x \gt 0) \cdot x\)
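In plain Python (a scalar sketch of what `torch.nn.ReLU` applies elementwise):

```python
def relu(x: float) -> float:
    # max(x, 0): passes positive inputs through, zeroes out negatives
    return max(x, 0.0)
```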

LeakyReLU

https://pytorch.org/docs/stable/generated/torch.nn.LeakyReLU.html

LeakyReLU passes positive inputs through and scales negative inputs by a small positive slope (0.01 by default in PyTorch), so the gradient never vanishes entirely: \(\displaystyle \operatorname{LeakyReLU}(x) = (x \gt 0) \cdot x + (x \le 0) \cdot 0.01x\)
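A scalar sketch (mirroring `torch.nn.LeakyReLU`'s `negative_slope` parameter):

```python
def leaky_relu(x: float, negative_slope: float = 0.01) -> float:
    # Like ReLU, but negative inputs are scaled by a small slope
    # instead of being zeroed out.
    return x if x > 0 else negative_slope * x
```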

SiLU

https://pytorch.org/docs/stable/generated/torch.nn.SiLU.html

The sigmoid-weighted linear unit (also known as swish): \(\displaystyle \operatorname{SiLU}(x) = x \cdot \sigma(x)\)

GeLU

https://pytorch.org/docs/stable/generated/torch.nn.GELU.html

The Gaussian error linear unit weights its input by the standard normal CDF \(\Phi\): \(\displaystyle \operatorname{GELU}(x) = x \cdot \Phi(x)\)

GeGLU

A gated variant of GELU from GLU Variants Improve Transformer (Shazeer, 2020): one linear projection is passed through GELU and gates a second projection elementwise, \(\displaystyle \operatorname{GeGLU}(x) = \operatorname{GELU}(xW + b) \otimes (xV + c)\)
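Scalar sketches of the three functions above (plain Python; the GeGLU sketch takes an already-projected vector and splits it in half rather than applying the two linear layers):

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def silu(x: float) -> float:
    # SiLU / swish: the input weighted by its own sigmoid
    return x * sigmoid(x)

def gelu(x: float) -> float:
    # Exact GELU: x * Phi(x), with Phi the standard normal CDF
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def geglu(h: list[float]) -> list[float]:
    # GeGLU on a pre-split hidden vector: the first half is passed
    # through GELU and gates the second half elementwise. (In a
    # transformer layer, the halves come from two linear projections,
    # xW + b and xV + c.)
    a, b = h[: len(h) // 2], h[len(h) // 2 :]
    return [gelu(ai) * bi for ai, bi in zip(a, b)]
```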

Resources