opennmt.utils.losses module

Define losses.

opennmt.utils.losses.cross_entropy_sequence_loss(logits, labels, sequence_length, label_smoothing=0.0, average_in_time=False, mode='train', training=None)[source]

Computes the cross entropy loss of sequences.

Parameters:
  • logits – The unscaled probabilities.
  • labels – The true labels.
  • sequence_length – The length of each sequence.
  • label_smoothing – The label smoothing value.
  • average_in_time – If True, also average the loss in the time dimension.
  • mode – A tf.estimator.ModeKeys mode.
  • training – Whether to compute the training loss. If not set, the training mode is inferred from mode.
Returns:

A tuple (cumulated loss, loss normalizer, token-level normalizer).
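The shape of the returned tuple can be illustrated with a minimal NumPy sketch. This is not the library implementation: the masking and normalizer conventions shown here (padding positions excluded via sequence_length, loss normalizer equal to the token count when average_in_time is True and to the batch size otherwise) are assumptions for illustration, and label smoothing and mode/training handling are omitted.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sequence_loss_sketch(logits, labels, sequence_length, average_in_time=False):
    """Sketch of a masked sequence cross entropy loss.

    logits: float array [batch, time, vocab] (unscaled probabilities)
    labels: int array [batch, time]
    sequence_length: int array [batch]
    """
    batch, time, _ = logits.shape
    probs = softmax(logits)
    # Negative log-likelihood of the true label at each position.
    token_nll = -np.log(
        probs[np.arange(batch)[:, None], np.arange(time)[None, :], labels])
    # Mask out padding positions beyond each sequence's length.
    mask = np.arange(time)[None, :] < sequence_length[:, None]
    cumulated_loss = (token_nll * mask).sum()
    num_tokens = mask.sum()
    # Assumed convention: normalize per token when averaging in time,
    # per sequence otherwise.
    loss_normalizer = num_tokens if average_in_time else batch
    return cumulated_loss, loss_normalizer, num_tokens
```

Dividing the cumulated loss by the loss normalizer then yields the mean loss per sequence (or per token, with average_in_time=True), while the token-level normalizer supports token-based reporting such as perplexity.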

opennmt.utils.losses.cross_entropy_loss(logits, labels, label_smoothing=0.0, mode='train', training=None)[source]

Computes the cross entropy loss.

Parameters:
  • logits – The unscaled probabilities.
  • labels – The true labels.
  • label_smoothing – The label smoothing value.
  • mode – A tf.estimator.ModeKeys mode.
  • training – Whether to compute the training loss. If not set, the training mode is inferred from mode.
Returns:

The cumulated loss and the loss normalizer.
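The effect of label_smoothing can be illustrated with a short NumPy sketch. This is not the library implementation: it uses one common smoothing scheme (mixing the one-hot target with a uniform distribution over the vocabulary) and assumes the loss normalizer is the batch size; the library may use a different smoothing formulation.

```python
import numpy as np

def log_softmax(x):
    # Numerically stable log-softmax over the last axis.
    x = x - x.max(axis=-1, keepdims=True)
    return x - np.log(np.exp(x).sum(axis=-1, keepdims=True))

def cross_entropy_sketch(logits, labels, label_smoothing=0.0):
    """Sketch of cross entropy with label smoothing.

    logits: float array [batch, vocab] (unscaled probabilities)
    labels: int array [batch]
    """
    batch, vocab = logits.shape
    lp = log_softmax(logits)
    onehot = np.eye(vocab)[labels]
    # Assumed smoothing scheme: interpolate the one-hot target with a
    # uniform distribution over all classes.
    targets = onehot * (1.0 - label_smoothing) + label_smoothing / vocab
    per_example = -(targets * lp).sum(axis=-1)
    # Assumed convention: normalize the cumulated loss by the batch size.
    return per_example.sum(), batch
```

With label_smoothing=0.0 this reduces to the standard negative log-likelihood of the true labels; a positive value penalizes over-confident predictions by assigning some target mass to the other classes.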