CategoricalCrossEntropyLoss
Categorical Cross-Entropy Loss.
This wraps CrossEntropyLoss and is provided for API compatibility with frameworks like Keras/TensorFlow that use this naming convention.
Use this for multi-class classification where:
- Predictions are logits (pre-softmax) of shape (batch, num_classes).
- Targets are either:
  - one-hot encoded probabilities of shape (batch, num_classes), or
  - class indices of shape (batch,) with dtype Int32.
The loss applies log-softmax internally, so do NOT apply softmax to your model's output before passing it to this loss.
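As a reference for the behavior described above, here is a minimal NumPy sketch (not this library's implementation) of categorical cross-entropy computed from raw logits. It applies a numerically stable log-softmax internally and accepts either target encoding; the function name and signature are illustrative only.

```python
import numpy as np

def categorical_cross_entropy(logits, targets):
    """Mean cross-entropy from raw logits (illustrative sketch).

    `targets` may be one-hot probabilities with the same shape as
    `logits`, or integer class indices of shape (batch,).
    """
    # Numerically stable log-softmax over the class dimension.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    if targets.ndim == logits.ndim:
        # One-hot / probability targets: weighted sum of log-probabilities.
        per_sample = -(targets * log_probs).sum(axis=-1)
    else:
        # Integer class indices: pick the log-probability of the true class.
        per_sample = -np.take_along_axis(
            log_probs, targets[:, None], axis=-1
        ).squeeze(-1)
    return per_sample.mean()

logits = np.array([[2.0, 0.5, -1.0], [0.1, 1.5, 0.3]])
idx = np.array([0, 1])
one_hot = np.eye(3)[idx]
# Both target encodings yield the same loss value.
assert np.isclose(categorical_cross_entropy(logits, idx),
                  categorical_cross_entropy(logits, one_hot))
```

Because the log-softmax happens inside the loss, passing already-softmaxed probabilities as `logits` would apply softmax twice and produce a wrong (typically flattened) loss.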
Parameters
dim
The dimension along which to compute the softmax. Defaults to -1 (the last dimension, i.e. the class dimension for (batch, num_classes) inputs).
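To illustrate what `dim` controls, the NumPy sketch below (an assumption about the semantics, not this library's code) shows that log-softmax along `dim=-1` normalizes each row into one probability distribution per sample:

```python
import numpy as np

def log_softmax(x, dim=-1):
    # Numerically stable log-softmax along the given axis.
    shifted = x - x.max(axis=dim, keepdims=True)
    return shifted - np.log(np.exp(shifted).sum(axis=dim, keepdims=True))

x = np.array([[2.0, 0.5, -1.0], [0.1, 1.5, 0.3]])
# With dim=-1, each row's probabilities sum to 1: one distribution
# over num_classes per sample in the batch.
probs = np.exp(log_softmax(x, dim=-1))
assert np.allclose(probs.sum(axis=-1), 1.0)
```

A different `dim` would normalize across the batch instead, which is almost never what you want for (batch, num_classes) logits.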