LeakyReLU
class LeakyReLU<T : DType, V>(val negativeSlope: Float = 0.01f, val name: String = "LeakyReLU") : Module<T, V> (source)
Leaky ReLU activation function.
LeakyReLU(x) = max(0, x) + negativeSlope * min(0, x)

Equivalently:

LeakyReLU(x) = x                  if x >= 0
LeakyReLU(x) = negativeSlope * x  if x < 0
Unlike standard ReLU, which zeros out all negative values, LeakyReLU lets a small gradient flow through for negative inputs. This can help prevent the "dying ReLU" problem, where a unit's pre-activations become permanently negative and the unit stops learning.
Parameters
negativeSlope
The slope for negative values (default: 0.01)
name
Name of the module
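To illustrate the piecewise definition above, here is a minimal plain-Kotlin sketch that applies the same rule element-wise to a FloatArray. The helper function `leakyReLU` is hypothetical and for illustration only; the actual class operates as a Module over tensors of type T.

```kotlin
// Illustrative sketch, not the library implementation:
// applies LeakyReLU(x) = x if x >= 0, negativeSlope * x otherwise.
fun leakyReLU(x: FloatArray, negativeSlope: Float = 0.01f): FloatArray =
    FloatArray(x.size) { i ->
        val v = x[i]
        if (v >= 0f) v else negativeSlope * v
    }

fun main() {
    val out = leakyReLU(floatArrayOf(-2f, -0.5f, 0f, 3f))
    // Negative inputs are scaled by 0.01; non-negative inputs pass through unchanged.
    println(out.joinToString())
}
```

With the default negativeSlope of 0.01f, the negative inputs -2 and -0.5 are scaled to small negative outputs rather than clipped to zero, which is what keeps gradients nonzero on the negative side.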