SparseCategoricalCrossEntropyLoss
Sparse Categorical Cross-Entropy Loss.
A wrapper around CrossEntropyLoss that takes integer class indices as targets rather than one-hot encoded targets.
Use this for multi-class classification where:
- Predictions are logits (pre-softmax) of shape (batch, num_classes)
- Targets are class indices of shape (batch,) with dtype Int32
This is equivalent to PyTorch's CrossEntropyLoss or TensorFlow's SparseCategoricalCrossentropy.
Parameters
dim
The dimension along which to compute the softmax. Default is -1 (last dimension).
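The semantics described above can be sketched in plain NumPy. This is a hypothetical standalone implementation for illustration, not this library's actual API; the function name and signature are assumptions:

```python
import numpy as np

def sparse_categorical_cross_entropy(logits, targets):
    """Mean negative log-likelihood of integer class targets given logits.

    Illustrative sketch: softmax is taken over the last dimension,
    matching the documented default of dim=-1.
    """
    # Numerically stable log-softmax over the class dimension.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    # Gather the log-probability of each sample's target class directly
    # from the integer indices -- no one-hot encoding needed.
    nll = -log_probs[np.arange(len(targets)), targets]
    return nll.mean()

logits = np.array([[2.0, 0.5, 0.1],
                   [0.2, 3.0, 0.3]])          # shape (batch, num_classes)
targets = np.array([0, 1], dtype=np.int32)    # shape (batch,), class indices
loss = sparse_categorical_cross_entropy(logits, targets)
```

The gather step is the only difference from a dense categorical cross-entropy: instead of multiplying log-probabilities by a one-hot matrix, the target index selects the relevant entry per row.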