• -h: This help. [false]
  • -md: Dump help in Markdown format. [false]
  • -config: Read options from config file. []
  • -save_config: Save options to a config file. []
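As a sketch of the config round-trip above: `-save_config` writes the current options to a file that `-config` can read back. The `key = value` layout below is an assumption for illustration; generate a real file with `-save_config` to see the exact format the tool emits.

```shell
# Assumed 'key = value' config layout (verify against -save_config output):
printf 'beam_size = 10\nbatch_size = 16\n' > demo.conf

# Hypothetical round-trip (the binary name is a placeholder):
#   translate -save_config demo.conf ...
#   translate -config demo.conf -src src.txt -output pred.txt
```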

Data options

  • -src: Source sequence to decode (one line per sequence) []
  • -tgt: True target sequence (optional) []
  • -output: Path to output the predictions (each line will be the decoded sequence) [pred.txt]

Translator options

  • -model: Path to model .t7 file []
  • -beam_size: Beam size [5]
  • -batch_size: Batch size [30]
  • -max_sent_length: Maximum output sentence length. [250]
  • -replace_unk: Replace generated UNK tokens with the source token that received the highest attention weight. If -phrase_table is provided, the identified source token is looked up and the corresponding target token is used; if it is not provided (or the source token is not in the table), the source token is copied as-is. [false]
  • -phrase_table: Path to a source-target dictionary used to replace UNK tokens; see the documentation for the expected file format. []
  • -n_best: If > 1, also output an n-best list of decoded sentences. [1]
  • -max_num_unks: Sequences with more UNK tokens than this are ignored during beam search. [inf]
  • -pre_filter_factor: Optional; set this only if a filter is being used. Before filters are applied, the hypotheses with the top beam_size * pre_filter_factor scores are considered. If the returned hypotheses violate the filters, increase this value to consider more hypotheses. [1]
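To illustrate -replace_unk with a phrase table: the sketch below writes a tiny dictionary and shows a hypothetical command combining it with an n-best list. The `src|||tgt` one-pair-per-line layout is an assumption, as are the binary name, model path, and example token pairs; check the tool's documentation for the actual file format.

```shell
# Assumed 'source|||target' dictionary layout (verify against the docs);
# the token pairs are made up for illustration:
printf 'Katze|||cat\nHund|||dog\n' > phrase_table.txt

# Hypothetical invocation: replace UNKs via the table and emit a 5-best list:
#   translate -model demo-model.t7 -src src.txt -output pred.txt \
#       -replace_unk true -phrase_table phrase_table.txt -n_best 5
```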

Other options

  • -time: Measure batch translation time [false]
  • -gpuid: List of comma-separated GPU identifiers (1-indexed). CPU is used when set to 0. [0]
  • -fallback_to_cpu: If the GPU cannot be used, fall back to the CPU. [false]
  • -fp16: Use half-precision float on GPU. [false]
  • -no_nccl: Disable use of NCCL in parallel mode. [false]
  • -log_file: Outputs logs to a file under this path instead of stdout. []
  • -disable_logs: If set, suppress all log output. [false]
  • -log_level: (DEBUG, INFO, WARNING, ERROR) Outputs logs at this level and above. [INFO]
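Putting the device and logging options together, a hedged sketch of a multi-GPU run (the binary name and model path are placeholders): GPU identifiers are 1-indexed, 0 selects the CPU, and a comma-separated list requests several devices.

```shell
# Run on GPUs 1 and 2, fall back to the CPU if no GPU is usable,
# and write WARNING-and-above logs to a file instead of stdout.
# Stored as a string here only so the command can be shown without running it:
cmd='translate -model demo-model.t7 -src src.txt -output pred.txt
  -gpuid 1,2 -fallback_to_cpu true -log_file translate.log -log_level WARNING'
echo "$cmd"
```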