ELU

class ELU<T : DType, V>(val alpha: Float = 1.0f, val name: String = "ELU") : Module<T, V>

Exponential Linear Unit (ELU) activation function.

ELU(x) = x                      if x >= 0
       = alpha * (exp(x) - 1)   if x < 0

ELU can produce negative outputs, which pushes the mean of activations closer to zero; this can speed up learning and lead to higher accuracy. Unlike LeakyReLU, ELU saturates for large negative inputs, making it more robust to noise.

Reference: "Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)" https://arxiv.org/abs/1511.07289

Parameters

alpha

The scale for the negative region (default: 1.0)

name

Name of the module

Constructors

constructor(alpha: Float = 1.0f, name: String = "ELU")

Properties

open override val modules: List<Module<T, V>>
open override val name: String

Functions

open override fun forward(input: Tensor<T, V>, ctx: ExecutionContext): Tensor<T, V>