SparseCategoricalCrossEntropyLoss

class SparseCategoricalCrossEntropyLoss @JvmOverloads constructor(dim: Int = -1) : Loss

Sparse Categorical Cross-Entropy Loss.

This wraps CrossEntropyLoss and accepts integer class indices as targets rather than one-hot encoded targets.

Use this for multi-class classification where:

  • Predictions are logits (pre-softmax) of shape (batch, num_classes)

  • Targets are class indices of shape (batch,) with dtype Int32

This is equivalent to PyTorch's CrossEntropyLoss or TensorFlow's SparseCategoricalCrossentropy.
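The computation described above can be sketched in plain Kotlin, independent of the library's Tensor types. The helper below is hypothetical (not part of this API) and shows what the loss evaluates under mean reduction: for each sample, the negative log-softmax probability of the true class, computed from logits and an integer target index.

```kotlin
import kotlin.math.exp
import kotlin.math.ln

// Hypothetical helper illustrating the computation, not a library function.
// preds: logits of shape (batch, numClasses); targets: class indices of shape (batch).
fun sparseCategoricalCrossEntropy(preds: Array<DoubleArray>, targets: IntArray): Double {
    require(preds.size == targets.size) { "batch sizes must match" }
    var total = 0.0
    for (i in preds.indices) {
        val logits = preds[i]
        // Log-sum-exp with max subtraction for numerical stability.
        val max = logits.maxOrNull()!!
        val logSumExp = max + ln(logits.sumOf { exp(it - max) })
        // Negative log-probability of the true class: -(logit[y] - logSumExp).
        total += logSumExp - logits[targets[i]]
    }
    return total / preds.size // mean reduction over the batch
}
```

Note that the targets never need to be one-hot encoded: indexing `logits[targets[i]]` selects the true-class logit directly, which is exactly the simplification this loss provides over a dense cross-entropy.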

Parameters

dim

The dimension along which to compute the softmax. Default is -1 (last dimension).

Constructors

constructor(dim: Int = -1)

Functions

open override fun <T : DType, V> forward(preds: Tensor<T, V>, targets: Tensor<out DType, *>, ctx: ExecutionContext, reduction: Reduction): Tensor<T, V>