cross_entropy_sequence_loss(logits, labels, sequence_length, label_smoothing=0.0, average_in_time=False, training=None)
Computes the cross entropy loss of sequences.
logits – The unscaled probabilities (pre-softmax scores).
labels – The true labels.
sequence_length – The length of each sequence.
label_smoothing – The label smoothing value.
average_in_time – If True, also average the loss in the time dimension.
training – Whether to compute the training loss.
Returns a tuple (cumulative loss, loss normalizer, token-level normalizer).
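The semantics described above can be sketched as follows. This is a minimal NumPy illustration of the documented behavior, not the library's implementation: it assumes logits of shape [batch, time, vocab] and integer labels of shape [batch, time], masks positions beyond each sequence_length, distributes label_smoothing mass over the non-target classes, and returns the cumulative loss with the two normalizers (the loss normalizer is the batch size, or the token count when average_in_time is True).

```python
import numpy as np

def cross_entropy_sequence_loss_sketch(logits, labels, sequence_length,
                                       label_smoothing=0.0,
                                       average_in_time=False):
    # Hypothetical sketch of the documented semantics (not the real code).
    batch, time, vocab = logits.shape
    # Numerically stable log-softmax over the vocabulary dimension.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    # Smoothed target distribution: the true class gets 1 - label_smoothing,
    # the remaining mass is spread uniformly over the other classes.
    on_value = 1.0 - label_smoothing
    off_value = label_smoothing / (vocab - 1)
    targets = np.full((batch, time, vocab), off_value)
    targets[np.arange(batch)[:, None], np.arange(time)[None, :], labels] = on_value
    token_loss = -(targets * log_probs).sum(axis=-1)  # [batch, time]
    # Mask out padding positions beyond each sequence length.
    mask = (np.arange(time)[None, :]
            < np.asarray(sequence_length)[:, None]).astype(float)
    loss = (token_loss * mask).sum()          # cumulative loss
    token_normalizer = mask.sum()             # number of real tokens
    # Dividing by the loss normalizer averages over sequences, or over
    # tokens as well when average_in_time is True.
    loss_normalizer = token_normalizer if average_in_time else float(batch)
    return loss, loss_normalizer, token_normalizer
```

For example, dividing loss by loss_normalizer gives a per-sequence average by default, which is the usual quantity reported during training; average_in_time=True instead yields a per-token average, which is comparable across batches with different sequence lengths.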