Function leaky_relu
pub fn leaky_relu<const D: usize, B>(
    tensor: Tensor<B, D>,
    negative_slope: f64,
) -> Tensor<B, D>
where
    B: Backend,
Applies the leaky rectified linear unit function element-wise.
$$ \text{LeakyReLU}(x) = \max(0,x) + \text{negative\_slope} \times \min(0, x) $$
or
$$ \text{LeakyReLU}(x) = \begin{cases} x & \text{if } x \geq 0 \newline \text{negative\_slope} \times x & \text{otherwise} \end{cases} $$
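The two formulations above are equivalent: for $x \geq 0$, $\min(0, x) = 0$ and the first formula reduces to $x$; for $x < 0$, $\max(0, x) = 0$ and it reduces to $\text{negative\_slope} \times x$. A minimal scalar sketch of the first formula (the helper `leaky_relu_scalar` is hypothetical, for illustration only, and does not use the Burn tensor API):

```rust
// Hypothetical scalar version of the LeakyReLU formula above;
// the real `leaky_relu` applies this element-wise to a Tensor<B, D>.
fn leaky_relu_scalar(x: f64, negative_slope: f64) -> f64 {
    // max(0, x) + negative_slope * min(0, x)
    x.max(0.0) + negative_slope * x.min(0.0)
}

fn main() {
    // Non-negative inputs pass through unchanged.
    assert_eq!(leaky_relu_scalar(2.0, 0.01), 2.0);
    // Negative inputs are scaled by negative_slope instead of clamped to zero.
    assert_eq!(leaky_relu_scalar(-2.0, 0.01), -0.02);
    println!("ok");
}
```

Unlike plain ReLU, the non-zero slope for negative inputs keeps a small gradient flowing, which can help avoid "dead" units during training.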