Module activation
The activation module.
Functions
- gelu: Applies the Gaussian Error Linear Units function as described in the paper Gaussian Error Linear Units (GELUs).
- hard_sigmoid: Applies the hard sigmoid function element-wise.
- leaky_relu: Applies the leaky rectified linear unit function element-wise.
- log_sigmoid: Applies the log sigmoid function element-wise.
- log_softmax: Applies the log softmax function on the input tensor along the given dimension.
- mish: Applies the Mish function as described in the paper Mish: A Self Regularized Non-Monotonic Neural Activation Function.
- prelu: Applies the Parametric ReLU activation function as described in the paper Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification.
- quiet_softmax: Applies the “quiet softmax” function on the input tensor along the given dimension.
- relu: Applies the rectified linear unit function element-wise as described in the paper Deep Learning using Rectified Linear Units (ReLU).
- sigmoid: Applies the sigmoid function element-wise.
- silu: Applies the SiLU function (also known as the swish function) element-wise.
- softmax: Applies the softmax function on the input tensor along the given dimension.
- softmin: Applies the softmin function on the input tensor along the given dimension.
- softplus: Applies the SoftPlus function element-wise.
- tanh: Applies the tanh function element-wise.