Activation Functions
List of common activation functions
Sinusoidal
Tanh
One of the older activation functions. Squashes inputs to the range (-1, 1).
Sigmoid
Squashes inputs to the range (0, 1): \(\displaystyle \sigma(x) = \frac{1}{1+e^{-x}}\)
Cosine/Sine
See SIREN (sinusoidal representation networks), which use sine activations.
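A minimal PyTorch sketch of these three (the input values and the \(\omega_0 = 30\) frequency scaling, taken from the SIREN paper's suggestion, are only illustrative):

```python
import torch

x = torch.linspace(-3, 3, 7)

# Tanh: squashes inputs to (-1, 1)
print(torch.tanh(x))

# Sigmoid: squashes inputs to (0, 1)
print(torch.sigmoid(x))

# Sine activation as used by SIREN, applied to a scaled input
print(torch.sin(30 * x))  # 30 is the omega_0 scaling suggested in the SIREN paper
```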
ReLU
https://pytorch.org/docs/stable/generated/torch.nn.ReLU.html
ReLU is one of the most popular activation functions. It simply computes \(\displaystyle \operatorname{ReLU}(x) = \max(x, 0) = (x \gt 0) * x\)
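A minimal sketch showing that torch.nn.ReLU matches the elementwise formula (the tensor values are only illustrative):

```python
import torch
import torch.nn as nn

x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])

relu = nn.ReLU()
print(relu(x))  # tensor([0.0000, 0.0000, 0.0000, 0.5000, 2.0000])

# Equivalent to the formula above
print(torch.maximum(x, torch.zeros_like(x)))
print((x > 0) * x)
```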
LeakyReLU
https://pytorch.org/docs/stable/generated/torch.nn.LeakyReLU.html
\(\displaystyle \operatorname{LeakyReLU}(x) = (x \gt 0) * x + (x \le 0) * (0.01x)\), where 0.01 is PyTorch's default negative slope.
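A minimal sketch comparing torch.nn.LeakyReLU against the elementwise formula (again, the tensor values are only illustrative):

```python
import torch
import torch.nn as nn

x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])

# negative_slope defaults to 0.01 in PyTorch
leaky = nn.LeakyReLU(negative_slope=0.01)
print(leaky(x))  # tensor([-0.0200, -0.0050, 0.0000, 0.5000, 2.0000])

# Equivalent elementwise formula
print((x > 0) * x + (x <= 0) * (0.01 * x))
```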
SiLU
https://pytorch.org/docs/stable/generated/torch.nn.SiLU.html
Also known as the swish function: \(\displaystyle \operatorname{SiLU}(x) = x * \sigma(x)\), where \(\sigma\) is the sigmoid function.
GELU
https://pytorch.org/docs/stable/generated/torch.nn.GELU.html
\(\displaystyle \operatorname{GELU}(x) = x * \Phi(x)\), where \(\Phi\) is the CDF of the standard normal distribution. A smooth alternative to ReLU, commonly used in transformer models.
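A minimal sketch checking the built-in SiLU and GELU modules against their definitions; the manual GELU formula assumes PyTorch's default exact (non-approximate) GELU:

```python
import torch
import torch.nn as nn

x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])

# SiLU (swish): x * sigmoid(x)
print(nn.SiLU()(x))
print(x * torch.sigmoid(x))   # same values

# GELU: x * Phi(x), where Phi is the standard normal CDF
print(nn.GELU()(x))
normal = torch.distributions.Normal(0.0, 1.0)
print(x * normal.cdf(x))      # same values for the exact (default) GELU
```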