Summarization - Pegasus - min_length


I am experimenting with the minimum number of tokens in the output generated by “google/pegasus-cnn_dailymail”. I basically follow the documentation, so my code looks like this:

    batch = tokPeg.prepare_seq2seq_batch(src_texts=[s]).to(torch_device)
    gen = modelPeg.generate(**batch, num_beams=8, min_lenght=100)
    summary: List[str] = tokPeg.batch_decode(gen, skip_special_tokens=True)

However, when I count the number of tokens in the output text with len(tokPeg.tokenize(summary[0])), the output contains fewer tokens than specified in min_length. Is there anything I am missing?

This might be a red herring, but your code snippet shows “min_lenght” where it should be “min_length”.
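To add to that: the reason this fails silently rather than raising an error is that `generate`-style functions accept arbitrary extra keyword arguments via `**kwargs`, so a misspelled name is simply swallowed and the real parameter keeps its default. Here is a minimal, self-contained sketch of that behavior (a stand-in function, not the actual transformers implementation):

```python
# Hypothetical stand-in for model.generate, illustrating why a typo in a
# keyword argument is silently ignored instead of raising TypeError.
def generate(num_beams=1, min_length=0, **kwargs):
    # Unknown keywords land in **kwargs; min_length keeps its default.
    return {
        "num_beams": num_beams,
        "min_length": min_length,
        "ignored_kwargs": sorted(kwargs),
    }

# Misspelled keyword: 'min_lenght' goes into **kwargs, and the intended
# min_length=100 constraint is never applied.
out = generate(num_beams=8, min_lenght=100)
print(out["min_length"])       # still 0, the default
print(out["ignored_kwargs"])   # ['min_lenght']
```

So with the typo fixed (`min_length=100`), the generated summaries should respect the minimum length.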