Package-level declarations

Types

class BatchNormalization<T : DType, V>(numFeatures: Int, eps: Double = 1.0E-5, momentum: Double = 0.1, affine: Boolean = true, val name: String = "BatchNormalization", initGamma: Tensor<T, V>? = null, initBeta: Tensor<T, V>? = null) : Module<T, V> , ModuleParameters<T, V>

BatchNormalization layer for training stability and performance. Normalizes the input across the batch dimension. https://arxiv.org/abs/1502.03167
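A minimal sketch of the batch-normalization math over plain arrays (affine scale/shift omitted): each feature is normalized with the mean and variance computed across the batch dimension. The function name and array layout are illustrative assumptions, not this library's API.

```kotlin
import kotlin.math.sqrt

// batch[i][f] = value of feature f for sample i; normalizes per feature
// across all samples (illustrative sketch, no gamma/beta affine step).
fun batchNormalize(batch: Array<FloatArray>, eps: Double = 1e-5): Array<FloatArray> {
    val n = batch.size
    val numFeatures = batch[0].size
    val out = Array(n) { FloatArray(numFeatures) }
    for (f in 0 until numFeatures) {
        val mean = batch.sumOf { it[f].toDouble() } / n
        val variance = batch.sumOf { val d = it[f] - mean; d * d } / n
        val invStd = 1.0 / sqrt(variance + eps)
        for (i in 0 until n) {
            out[i][f] = ((batch[i][f] - mean) * invStd).toFloat()
        }
    }
    return out
}
```

After normalization each feature column has approximately zero mean and unit variance across the batch.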

interface FusedRmsNormOps

Optional interface that TensorOps implementations can provide to support fused RMS normalization without intermediate tensor allocations.
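To illustrate what fusion buys here, the sketch below contrasts a naive RMS-norm that materializes an intermediate normalized array with a fused single-pass version that writes the final scaled result directly. Function names are illustrative assumptions; they are not members of `FusedRmsNormOps`.

```kotlin
import kotlin.math.sqrt

// Naive path: allocates an intermediate normalized array, then scales it.
fun rmsNormNaive(x: FloatArray, weight: FloatArray, eps: Double = 1e-5): FloatArray {
    val meanSq = x.sumOf { it.toDouble() * it } / x.size
    val invRms = 1.0 / sqrt(meanSq + eps)
    val normalized = FloatArray(x.size) { i -> (x[i] * invRms).toFloat() } // extra allocation
    return FloatArray(x.size) { i -> normalized[i] * weight[i] }
}

// Fused path: normalization and weight scaling in one pass, one allocation.
fun rmsNormFused(x: FloatArray, weight: FloatArray, eps: Double = 1e-5): FloatArray {
    val meanSq = x.sumOf { it.toDouble() * it } / x.size
    val invRms = 1.0 / sqrt(meanSq + eps)
    return FloatArray(x.size) { i -> (x[i] * invRms * weight[i]).toFloat() }
}
```

Both produce the same result (up to float rounding); the fused path avoids the intermediate tensor, which is what a `FusedRmsNormOps` implementation makes possible at the `TensorOps` level.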

class GroupNormalization<T : DType, V>(numGroups: Int, numChannels: Int, eps: Double = 1.0E-5, affine: Boolean = true, val name: String = "GroupNormalization", initGamma: Tensor<T, V>? = null, initBeta: Tensor<T, V>? = null) : Module<T, V> , ModuleParameters<T, V>

GroupNormalization layer - an alternative normalization approach. Normalizes the input by dividing the channels into groups and normalizing each group independently.
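The grouping idea can be sketched for a single sample laid out as a flat `[numChannels]` array: channels are split into `numGroups` contiguous groups, and each group gets its own mean and variance. The function name and layout are illustrative assumptions, not this library's API.

```kotlin
import kotlin.math.sqrt

// x holds one sample's channel values; each contiguous group of
// x.size / numGroups channels is normalized with its own statistics.
fun groupNormalize(x: FloatArray, numGroups: Int, eps: Double = 1e-5): FloatArray {
    require(x.size % numGroups == 0) { "channels must divide evenly into groups" }
    val groupSize = x.size / numGroups
    val out = FloatArray(x.size)
    for (g in 0 until numGroups) {
        val start = g * groupSize
        val slice = x.copyOfRange(start, start + groupSize)
        val mean = slice.sumOf { it.toDouble() } / groupSize
        val variance = slice.sumOf { val d = it - mean; d * d } / groupSize
        val invStd = 1.0 / sqrt(variance + eps)
        for (i in 0 until groupSize) {
            out[start + i] = ((x[start + i] - mean) * invStd).toFloat()
        }
    }
    return out
}
```

Because statistics are per-group and per-sample, the result does not depend on the other samples in the batch.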

class LayerNormalization<T : DType, V>(normalizedShape: IntArray, eps: Double = 1.0E-5, elementwiseAffine: Boolean = true, val name: String = "LayerNormalization", initGamma: Tensor<T, V>? = null, initBeta: Tensor<T, V>? = null) : Module<T, V> , ModuleParameters<T, V>

LayerNormalization layer - commonly used in attention mechanisms and transformer blocks. Normalizes the input across the last dimension(s).
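A minimal sketch of the layer-norm math for one sample: the mean and variance are taken over that sample's feature dimension alone, so the output is independent of the batch. The function name is an illustrative assumption (elementwise affine step omitted).

```kotlin
import kotlin.math.sqrt

// Normalizes one sample across its feature dimension (the last axis).
fun layerNormalize(x: FloatArray, eps: Double = 1e-5): FloatArray {
    val mean = x.sumOf { it.toDouble() } / x.size
    val variance = x.sumOf { val d = it - mean; d * d } / x.size
    val invStd = 1.0 / sqrt(variance + eps)
    return FloatArray(x.size) { i -> ((x[i] - mean) * invStd).toFloat() }
}
```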

class RMSNormalization<T : DType, V>(normalizedShape: IntArray, eps: Double = 1.0E-5, val name: String = "RMSNormalization", initWeight: Tensor<T, V>? = null) : Module<T, V> , ModuleParameters<T, V>

RMS (Root Mean Square) Normalization layer. Unlike LayerNormalization, RMSNorm does not subtract the mean and has no bias term; it normalizes using only the root mean square of the input, making it simpler and faster to compute.
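The contrast with layer normalization can be sketched directly: no mean subtraction and no bias, just division by the root mean square of the inputs. The function name is an illustrative assumption (the learnable weight from `initWeight` is omitted).

```kotlin
import kotlin.math.sqrt

// RMSNorm sketch: x / sqrt(mean(x^2) + eps), no centering, no bias.
fun rmsNormalize(x: FloatArray, eps: Double = 1e-5): FloatArray {
    val meanSq = x.sumOf { it.toDouble() * it } / x.size
    val invRms = 1.0 / sqrt(meanSq + eps)
    return FloatArray(x.size) { i -> (x[i] * invRms).toFloat() }
}
```

The output always has a root mean square of approximately 1, but unlike layer norm its mean is generally nonzero.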