LeakyReLU

class LeakyReLU<T : DType, V>(val negativeSlope: Float = 0.01f, val name: String = "LeakyReLU") : Module<T, V>

Leaky ReLU activation function.

LeakyReLU(x) = max(0, x) + negativeSlope * min(0, x)
             = x                  if x >= 0
             = negativeSlope * x  if x < 0

Unlike standard ReLU, which zeros out all negative values, LeakyReLU lets a small gradient flow through for negative inputs. This can help prevent the "dying ReLU" problem, where a unit's inputs become permanently negative and it stops updating during training.
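The piecewise formula above can be sketched in plain Kotlin on a `FloatArray`. This is an illustration of the math only, not the module itself, which operates on `Tensor<T, V>` via `forward`:

```kotlin
// Minimal sketch of the LeakyReLU formula on a plain FloatArray
// (illustration only; the actual module works on Tensor<T, V>).
fun leakyReLU(x: FloatArray, negativeSlope: Float = 0.01f): FloatArray =
    FloatArray(x.size) { i -> if (x[i] >= 0f) x[i] else negativeSlope * x[i] }

fun main() {
    // Negative inputs are scaled by negativeSlope instead of zeroed.
    println(leakyReLU(floatArrayOf(-2f, 0f, 3f)).joinToString())
}
```

Note that with the default `negativeSlope = 0.01f`, an input of `-2f` maps to `-0.02f`, while non-negative inputs pass through unchanged.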

Parameters

negativeSlope

The slope for negative values (default: 0.01)

name

Name of the module

Constructors

constructor(negativeSlope: Float = 0.01f, name: String = "LeakyReLU")

Properties

open override val modules: List<Module<T, V>>
open override val name: String

Functions

open override fun forward(input: Tensor<T, V>, ctx: ExecutionContext): Tensor<T, V>