Package-level declarations

Types

class BatchNormalization<T : DType, V>(numFeatures: Int, eps: Double = 1.0E-5, momentum: Double = 0.1, affine: Boolean = true, val name: String = "BatchNormalization", initGamma: Tensor<T, V>? = null, initBeta: Tensor<T, V>? = null) : Module<T, V> , ModuleParameters<T, V>

BatchNormalization layer for training stability and performance. Normalizes the input across the batch dimension. https://arxiv.org/abs/1502.03167
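The computation this layer performs can be sketched language-neutrally. The snippet below (NumPy, with a hypothetical `batch_norm` helper mirroring the `eps`, `gamma`, and `beta` parameters above) normalizes each feature using statistics taken over the batch dimension:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Per-feature mean/variance over the batch dimension (axis 0).
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Affine step corresponds to affine = true (learnable gamma, beta).
    return gamma * x_hat + beta

x = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])  # (batch=3, numFeatures=2)
y = batch_norm(x, gamma=np.ones(2), beta=np.zeros(2))
# Each feature column of y now has (approximately) zero mean and unit variance.
```

Note that the real layer also maintains running statistics (controlled by `momentum`) for use at inference time; the sketch shows only the per-batch statistics.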

class GroupNormalization<T : DType, V>(numGroups: Int, numChannels: Int, eps: Double = 1.0E-5, affine: Boolean = true, val name: String = "GroupNormalization", initGamma: Tensor<T, V>? = null, initBeta: Tensor<T, V>? = null) : Module<T, V> , ModuleParameters<T, V>

GroupNormalization layer - An alternative normalization approach whose statistics are independent of batch size. Normalizes the input by dividing the channels into groups and normalizing within each group. https://arxiv.org/abs/1803.08494
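The grouping can be sketched as follows (NumPy, with a hypothetical `group_norm` helper mirroring the `numGroups`, `numChannels`, and `eps` parameters above): each sample's channels are split into `num_groups` groups, and mean/variance are computed within each group independently.

```python
import numpy as np

def group_norm(x, num_groups, gamma, beta, eps=1e-5):
    # x has shape (N, C, ...); statistics are per sample, per channel group.
    N, C = x.shape[0], x.shape[1]
    g = x.reshape(N, num_groups, -1)
    mean = g.mean(axis=2, keepdims=True)
    var = g.var(axis=2, keepdims=True)
    x_hat = ((g - mean) / np.sqrt(var + eps)).reshape(x.shape)
    shape = (1, C) + (1,) * (x.ndim - 2)  # broadcast gamma/beta over channels
    return gamma.reshape(shape) * x_hat + beta.reshape(shape)

x = np.arange(24, dtype=np.float64).reshape(2, 4, 3)  # (N=2, numChannels=4, L=3)
y = group_norm(x, num_groups=2, gamma=np.ones(4), beta=np.zeros(4))
```

Because no statistic is shared across samples, the result does not depend on the batch size, which is why group normalization is preferred when batches are small.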

class LayerNormalization<T : DType, V>(normalizedShape: IntArray, eps: Double = 1.0E-5, elementwiseAffine: Boolean = true, val name: String = "LayerNormalization", initGamma: Tensor<T, V>? = null, initBeta: Tensor<T, V>? = null) : Module<T, V> , ModuleParameters<T, V>

LayerNormalization layer - Commonly used in attention and Transformer architectures. Normalizes the input across the last dimension(s). https://arxiv.org/abs/1607.06450
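The computation can be sketched as follows (NumPy, with a hypothetical `layer_norm` helper mirroring the `normalizedShape`, `eps`, and `elementwiseAffine` parameters above): statistics are taken over the trailing dimensions of each individual sample, so no information is shared across the batch.

```python
import numpy as np

def layer_norm(x, normalized_shape, gamma, beta, eps=1e-5):
    # Normalize over the trailing len(normalized_shape) dimensions per sample.
    axes = tuple(range(x.ndim - len(normalized_shape), x.ndim))
    mean = x.mean(axis=axes, keepdims=True)
    var = x.var(axis=axes, keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta  # elementwiseAffine step

x = np.array([[1.0, 2.0, 3.0], [4.0, 6.0, 11.0]])  # (batch=2, features=3)
y = layer_norm(x, normalized_shape=(3,), gamma=np.ones(3), beta=np.zeros(3))
# Each row of y is normalized independently of the other rows in the batch.
```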