BinaryCrossEntropyLoss
Binary Cross-Entropy Loss.
BCE(pred, target) = -(target * log(pred) + (1 - target) * log(1 - pred))
This loss is used for binary classification where predictions are probabilities in the range [0, 1].
For numerical stability, predictions are clamped to [epsilon, 1 - epsilon].
Parameters
epsilon
Small value for numerical stability. Default is 1e-7.
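The formula above can be sketched as follows. This is a minimal NumPy sketch, not the library's actual implementation; the function name and the mean reduction over elements are assumptions.

```python
import numpy as np

def binary_cross_entropy(pred, target, epsilon=1e-7):
    # Clamp predictions to [epsilon, 1 - epsilon] to avoid log(0).
    pred = np.clip(pred, epsilon, 1 - epsilon)
    # Element-wise BCE, averaged over all elements (reduction is an assumption).
    return np.mean(-(target * np.log(pred) + (1 - target) * np.log(1 - pred)))
```

For example, a prediction of 0.5 with target 1 yields -log(0.5) ≈ 0.693, while a confident correct prediction near 1.0 yields a loss near 0 thanks to the clamping.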