NeuralNetworkDsl

Core DSL interface for building neural networks with generic tensor types. This interface provides a fluent API for constructing neural network architectures with support for different data types and precision levels.

Parameters

T

The data type (DType) that determines the precision and storage format

V

The value type that corresponds to the native Kotlin type for the DType

Type constraints ensure compatibility between DType and value type:

  • T must extend DType to ensure valid tensor operations

  • V should match the native type expected by the DType implementation

Performance considerations:

  • FP32/Float: Best accuracy, highest memory usage

  • FP16/Float: Reduced memory, slightly lower accuracy

  • Int8/Byte: Minimal memory, quantized operations

  • Int32/Int: Integer operations, specific use cases
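As a sketch of how the precision choice surfaces when building a network (the `network` entry point and the `FP32`/`FP16` DType objects below are assumptions for illustration; this page only documents the builder interface itself):

```kotlin
// Hypothetical builder entry point; FP32 and FP16 are assumed DType objects.
val model = network<FP32, Float> {
    input(784)
    dense(256)      // uses the network default precision (FP32)
    relu()
    dense<FP16>(10) // per-layer precision override via the dense overload
}
```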

Functions

abstract fun activation(id: String = "", activation: (Tensor<T, V>) -> Tensor<T, V>)

Applies an activation function as a separate layer.
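For example, a custom activation can be supplied as a trailing lambda over the tensor (a sketch; the `clamp` tensor operation is an assumption, used here only to illustrate the lambda shape):

```kotlin
// Hypothetical usage: ReLU6 as a custom activation lambda.
// `clamp` is an assumed Tensor<T, V> operation, not documented on this page.
activation("relu6") { t -> t.clamp(0f, 6f) }
```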

abstract fun avgPool2d(kernelSize: Pair<Int, Int>, stride: Pair<Int, Int> = kernelSize, padding: Pair<Int, Int> = 0 to 0, countIncludePad: Boolean = true, id: String = "")

Creates a 2D average pooling layer for downsampling feature maps.

abstract fun batchNorm(numFeatures: Int, eps: Double = 1.0E-5, momentum: Double = 0.1, affine: Boolean = true, id: String = "")

Creates a batch normalization layer for training stability and performance. Normalizes the input across the batch dimension.

abstract fun conv1d(outChannels: Int, kernelSize: Int, stride: Int = 1, padding: Int = 0, dilation: Int = 1, groups: Int = 1, bias: Boolean = true, id: String = "", content: CONV1D<T, V>.() -> Unit = {})

Creates a 1D convolutional layer for processing sequence data.

abstract fun conv2d(id: String = "", content: CONV2D<T, V>.() -> Unit)

Creates a 2D convolutional layer with all parameters configured inside the DSL block. Example:

conv2d("conv1") {
    outChannels = 16
    kernelSize(5)
    stride(1)
    padding(2)
}

abstract fun conv2d(outChannels: Int, kernelSize: Pair<Int, Int>, stride: Pair<Int, Int> = 1 to 1, padding: Pair<Int, Int> = 0 to 0, dilation: Pair<Int, Int> = 1 to 1, groups: Int = 1, bias: Boolean = true, id: String = "", content: CONV2D<T, V>.() -> Unit = {})

Creates a 2D convolutional layer for processing spatial data like images.
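A sketch of the parameterized overload inside a builder block (the surrounding network context is assumed):

```kotlin
// 3x3 convolution with "same" padding at stride 1, so the spatial
// dimensions of the feature map are preserved.
conv2d(
    outChannels = 32,
    kernelSize = 3 to 3,
    stride = 1 to 1,
    padding = 1 to 1,
    id = "conv_block1"
)
```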

abstract fun conv3d(outChannels: Int, kernelSize: Triple<Int, Int, Int>, stride: Triple<Int, Int, Int> = Triple(1, 1, 1), padding: Triple<Int, Int, Int> = Triple(0, 0, 0), dilation: Triple<Int, Int, Int> = Triple(1, 1, 1), groups: Int = 1, bias: Boolean = true, id: String = "", content: CONV3D<T, V>.() -> Unit = {})

Creates a 3D convolutional layer for processing volumetric data.

abstract fun dense(id: String = "", content: DENSE<T, V>.() -> Unit = {})

Creates a dense layer without specifying output dimension (must be set in content block).

abstract fun <TLayer : DType> dense(id: String = "", content: DENSE<TLayer, V>.() -> Unit = {}): Module<T, V>

Creates a dense layer with precision override without specifying output dimension.

abstract fun dense(outputDimension: Int, id: String = "", content: DENSE<T, V>.() -> Unit = {})

Creates a dense (fully connected) layer with specified output dimension.

abstract fun <TLayer : DType> dense(outputDimension: Int, id: String = "", content: DENSE<TLayer, V>.() -> Unit = {}): Module<T, V>

Creates a dense layer with precision override and specified output dimension. This allows individual layers to use different precision than the network default.
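A sketch of the precision-override overload (the `FP16` DType object is an assumption; the point is that the layer's type parameter differs from the network default `T`):

```kotlin
// Mixed precision within one network: the network default might be FP32,
// while this layer computes in FP16. FP16 is an assumed DType object.
dense<FP16>(outputDimension = 128, id = "fc_half")
```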

fun <T : DType, V> NeuralNetworkDsl<T, V>.elu(alpha: Float = 1.0f, id: String = "")

Adds an ELU activation layer to the network.

abstract fun flatten(id: String = "", content: FLATTEN<T, V>.() -> Unit = {})

Creates a flatten layer that reshapes multidimensional tensors into 1D. Useful for transitioning from convolutional to dense layers.

fun <T : DType, V> NeuralNetworkDsl<T, V>.gelu(id: String = "")

Adds a GELU activation layer to the network.

abstract fun groupNorm(numGroups: Int, numChannels: Int, eps: Double = 1.0E-5, affine: Boolean = true, id: String = "")

Creates a group normalization layer - alternative normalization approach. Normalizes the input by dividing channels into groups and normalizing within each group.

abstract fun input(inputSize: Int, id: String = "", requiresGrad: Boolean = false)

Creates an input layer that defines the entry point for data into the network.

abstract fun layerNorm(normalizedShape: IntArray, eps: Double = 1.0E-5, elementwiseAffine: Boolean = true, id: String = "")

Creates a layer normalization layer - used in attention mechanisms. Normalizes the input across the last dimension(s).

fun <T : DType, V> NeuralNetworkDsl<T, V>.leakyRelu(negativeSlope: Float = 0.01f, id: String = "")

Adds a LeakyReLU activation layer to the network.

abstract fun maxPool2d(id: String = "", content: MAXPOOL2D<T, V>.() -> Unit)

Creates a 2D max pooling layer with all parameters configured inside the DSL block. Example:

maxPool2d("pool1") {
    kernelSize(2)
    stride(2)
    padding(0)
}

abstract fun maxPool2d(kernelSize: Pair<Int, Int>, stride: Pair<Int, Int> = kernelSize, padding: Pair<Int, Int> = 0 to 0, id: String = "")

Creates a 2D max pooling layer for downsampling feature maps.

fun <T : DType, V> NeuralNetworkDsl<T, V>.relu(id: String = "")

Adds a ReLU activation layer to the network.

abstract fun sequential(content: NeuralNetworkDsl<T, V>.() -> Unit)

Groups layers into a sequential block for better organization.
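For example, a common conv/norm/activation trio can be grouped as one unit (a sketch; the surrounding network context is assumed):

```kotlin
sequential {
    conv2d(outChannels = 64, kernelSize = 3 to 3, padding = 1 to 1)
    batchNorm(numFeatures = 64)
    relu()
}
```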

fun <T : DType, V> NeuralNetworkDsl<T, V>.sigmoid(id: String = "")

Adds a sigmoid activation layer to the network.

fun <T : DType, V> NeuralNetworkDsl<T, V>.silu(id: String = "")

Adds a SiLU activation layer to the network.

abstract fun softmax(dim: Int = -1, id: String = "")

Applies a Softmax activation as a separate layer.

abstract fun stage(id: String, content: NeuralNetworkDsl<T, V>.() -> Unit)

Creates a named stage/block within the network for modular design.

abstract fun <TStage : DType> stage(id: String, content: NeuralNetworkDsl<TStage, V>.() -> Unit): Module<T, V>

Creates a precision-scoped stage within the network. This allows grouping layers with a specific precision type that differs from the network default, enabling fine-grained mixed-precision control.
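A sketch of a precision-scoped stage (the `FP16` DType object is an assumption; everything inside the stage uses that precision while the rest of the network keeps the default):

```kotlin
// An encoder stage that runs in FP16 while the enclosing network
// keeps its default precision. FP16 is an assumed DType object.
stage<FP16>("encoder") {
    conv2d(outChannels = 32, kernelSize = 3 to 3)
    relu()
    maxPool2d(kernelSize = 2 to 2)
}
```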

abstract fun upsample2d(id: String = "", content: UPSAMPLE2D<T, V>.() -> Unit)

Creates a 2D upsampling layer with parameters configured in the DSL block.

abstract fun upsample2d(scale: Pair<Int, Int> = 2 to 2, mode: UpsampleMode = UpsampleMode.Nearest, alignCorners: Boolean = false, id: String = "")

Creates a 2D upsampling layer for increasing spatial resolution.
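For example, doubling spatial resolution in a decoder path with the documented defaults (a sketch; the surrounding network context is assumed):

```kotlin
// Nearest-neighbor upsampling, doubling height and width.
upsample2d(scale = 2 to 2, mode = UpsampleMode.Nearest, id = "up1")
```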