cross_entropy_sequence_loss
- opennmt.utils.cross_entropy_sequence_loss(logits, labels, sequence_length=None, label_smoothing=0.0, average_in_time=False, training=None, sequence_weight=None, mask_outliers=False)[source]
Computes the cross entropy loss of sequences.
- Parameters
logits – The unscaled probabilities with shape \([B, T, V]\).
labels – The true labels with shape \([B, T]\).
sequence_length – The length of each sequence with shape \([B]\).
label_smoothing – The label smoothing value.
average_in_time – If True, also average the loss in the time dimension.
training – Compute the training loss.
sequence_weight – The weight of each sequence with shape \([B]\).
mask_outliers – Mask large training loss values that are considered outliers.
- Returns
A tuple (cumulated loss, loss normalizer, token-level normalizer).
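To make the semantics of the return tuple concrete, here is a minimal NumPy sketch of the core computation: a log-softmax over the vocabulary axis, negative log-likelihood of the true labels, and masking of padded time steps. The function name `masked_cross_entropy` is illustrative, not part of the OpenNMT API, and this sketch omits label smoothing, sequence weights, and outlier masking.

```python
import numpy as np

def masked_cross_entropy(logits, labels, sequence_length, average_in_time=False):
    """Illustrative sketch of a masked sequence cross entropy.

    logits: float array [B, T, V]; labels: int array [B, T];
    sequence_length: int array [B].
    Returns (cumulated loss, loss normalizer, token-level normalizer),
    mirroring the documented return tuple.
    """
    B, T, V = logits.shape
    # Numerically stable log-softmax over the vocabulary axis.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    # Negative log-likelihood of the true labels, shape [B, T].
    nll = -np.take_along_axis(log_probs, labels[..., None], axis=-1).squeeze(-1)
    # Mask positions beyond each sequence's length (padding).
    mask = (np.arange(T)[None, :] < sequence_length[:, None]).astype(logits.dtype)
    cumulated = (nll * mask).sum()
    num_tokens = mask.sum()
    # With average_in_time, the loss is normalized per token;
    # otherwise it is normalized per sequence (batch size).
    normalizer = num_tokens if average_in_time else float(B)
    return cumulated, normalizer, num_tokens
```

Dividing the cumulated loss by the loss normalizer gives the scalar training loss; the token-level normalizer is useful for reporting per-token metrics such as perplexity.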