ELU
Exponential Linear Unit (ELU) activation function.
ELU(x) = x                      if x >= 0
       = alpha * (exp(x) - 1)   if x < 0
ELU can produce negative values, which pushes the mean of activations closer to zero; this can speed up learning and lead to higher accuracy. Unlike LeakyReLU, ELU saturates for large negative inputs, making it more robust to noise.
Reference: "Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)" https://arxiv.org/abs/1511.07289
Parameters
alpha
The scale for the negative region (default: 1.0)
name
Name of the module
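The piecewise definition above can be sketched as a standalone NumPy function (a minimal illustration, not this module's actual implementation; the function name and signature are assumptions):

```python
import numpy as np

def elu(x, alpha=1.0):
    # ELU(x) = x for x >= 0, alpha * (exp(x) - 1) for x < 0.
    # expm1 computes exp(x) - 1 accurately for small x; clamping the
    # argument at 0 avoids evaluating exp on large positive inputs,
    # whose branch is discarded by np.where anyway.
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0, x, alpha * np.expm1(np.minimum(x, 0)))
```

For x >= 0 the function is the identity, and as x -> -inf it saturates at -alpha, which is where the robustness to noise comes from.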