Function softplus

pub fn softplus<const D: usize, B>(
    tensor: Tensor<B, D>,
    beta: f64,
) -> Tensor<B, D>
where B: Backend,

Applies the SoftPlus function element-wise.

$$ \text{softplus}(x) = \frac{1}{\beta}\log(1 + \exp(\beta x)) $$

The SoftPlus function is a smooth approximation of the ReLU function: as `beta` increases, the output approaches $\max(0, x)$, while smaller values of `beta` give a softer transition around zero.
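
A minimal usage sketch, assuming the `burn` crate with the NdArray backend enabled and `softplus` re-exported from `burn::tensor::activation`; adjust the backend and import paths to your setup.

```rust
use burn::backend::NdArray; // assumed backend; requires the "ndarray" feature
use burn::tensor::{activation::softplus, Tensor};

fn main() {
    // Default device for the assumed NdArray backend.
    let device = Default::default();

    // A small 1-D tensor of sample inputs.
    let x = Tensor::<NdArray, 1>::from_floats([-2.0, 0.0, 2.0], &device);

    // Apply softplus element-wise with beta = 1.0.
    // Large negative inputs map close to 0, large positive inputs close to x.
    let y = softplus(x, 1.0);

    println!("{y}");
}
```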