Function mish
pub fn mish<const D: usize, B>(tensor: Tensor<B, D>) -> Tensor<B, D>
where
    B: Backend,
Applies the Mish function as described in the paper Mish: A Self Regularized Non-Monotonic Neural Activation Function.
$$ \text{Mish}(x) = x \cdot \tanh(\text{Softplus}(x)) = x \cdot \tanh\left(\log(1 + \exp(x))\right) $$
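A minimal usage sketch applying `mish` element-wise to a 1-D tensor. The `NdArray` backend is chosen here only for illustration (it assumes the `ndarray` feature is enabled; any type implementing `Backend` works), and import paths may vary across burn versions:

```rust
use burn::backend::NdArray;
use burn::tensor::{activation, Tensor};

fn main() {
    // Example backend; substitute any type implementing `Backend`.
    type B = NdArray;
    let device = Default::default();

    // A 1-D tensor of sample inputs.
    let x = Tensor::<B, 1>::from_floats([-2.0, 0.0, 2.0], &device);

    // mish(x) = x * tanh(log(1 + exp(x))), applied element-wise.
    let y = activation::mish(x);
    println!("{y}"); // approximately [-0.2525, 0.0, 1.9440]
}
```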