Function prelu

pub fn prelu<const D: usize, B>(
    tensor: Tensor<B, D>,
    alpha: Tensor<B, 1>,
) -> Tensor<B, D>
where B: Backend,

Applies the Parametric ReLU (PReLU) activation function, as described in the paper Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification.

The tensor is assumed to have shape [batch_size, channels, …]. alpha is assumed to have shape [channels] (one slope per channel) or [1] (a single slope shared by all channels).
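As a usage illustration, here is a minimal sketch; the function prelu_example and the sample values are assumptions for illustration, while the call itself follows the signature above:

```rust
use burn::tensor::{activation::prelu, backend::Backend, Tensor};

// Hypothetical example, generic over any backend `B`.
fn prelu_example<B: Backend>(device: &B::Device) -> Tensor<B, 2> {
    // Input of shape [batch_size, channels] = [1, 3].
    let x = Tensor::<B, 2>::from_floats([[-1.0, -2.0, 3.0]], device);

    // One slope per channel, shape [3].
    let alpha = Tensor::<B, 1>::from_floats([0.25, 0.1, 0.5], device);

    // Negative entries are scaled by their channel's slope;
    // positive entries pass through: [[-0.25, -0.2, 3.0]].
    prelu(x, alpha)
}
```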

$$ \text{PReLU}(x) = \max(0, x) + \alpha \cdot \min(0, x) $$

or, equivalently,

$$ \text{PReLU}(x) = \begin{cases} x & \text{if } x \geq 0 \\ \alpha x & \text{otherwise} \end{cases} $$
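
As a quick check that the two forms agree, take $x = -2$ and $\alpha = 0.25$ (values chosen for illustration):

$$ \text{PReLU}(-2) = \max(0, -2) + 0.25 \cdot \min(0, -2) = 0 + 0.25 \cdot (-2) = -0.5, $$

which matches the second form, since $-2 < 0$ gives $\alpha x = -0.5$.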