BCEWithLogitsLoss

Binary Cross-Entropy Loss with Logits (numerically stable).

This loss accepts raw logits (pre-sigmoid) and computes binary cross-entropy in a numerically stable form:

BCE(x, y) = max(x, 0) - x * y + log(1 + exp(-|x|))

This is more stable than applying sigmoid and then BCE separately: for large-magnitude logits, the naive path either saturates the sigmoid to exactly 0 or 1 (making log(p) or log(1 - p) undefined) or overflows in exp, while the rewritten form keeps every term finite.
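To make the difference concrete, here is a minimal Python sketch of the same formula on scalar values (an illustration of the math only, not this library's Kotlin API). The stable form uses `log1p(exp(-|x|))`, which never overflows; the naive form fails once the sigmoid saturates.

```python
import math

def bce_with_logits(x: float, y: float) -> float:
    # Stable form: max(x, 0) - x*y + log(1 + exp(-|x|)).
    # exp(-|x|) <= 1, so it can never overflow; log1p is accurate near 0.
    return max(x, 0.0) - x * y + math.log1p(math.exp(-abs(x)))

def bce_naive(x: float, y: float) -> float:
    # Naive form: sigmoid followed by cross-entropy.
    # For large |x| the sigmoid rounds to exactly 0.0 or 1.0,
    # and log(0) is undefined.
    p = 1.0 / (1.0 + math.exp(-x))
    return -(y * math.log(p) + (1.0 - y) * math.log(1.0 - p))

# Moderate logits: both forms agree.
print(bce_with_logits(2.0, 1.0))   # ~0.1269, same as bce_naive(2.0, 1.0)

# Extreme logits: the stable form stays finite,
# while bce_naive(100.0, 1.0) raises a math domain error
# because sigmoid(100.0) rounds to exactly 1.0.
print(bce_with_logits(1000.0, 1.0))
```

The `max(x, 0)` / `-|x|` split is the standard rewrite (the same one PyTorch's `BCEWithLogitsLoss` documents): it routes the exponent so that `exp` is only ever applied to a non-positive argument.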

Constructors

constructor()

Functions

open override fun <T : DType, V> forward(preds: Tensor<T, V>, targets: Tensor<out DType, *>, ctx: ExecutionContext, reduction: Reduction): Tensor<T, V>