Ensure the sentence is complete during generation

You’re right about the EOS token. If I don’t specify the max_length parameter, the model can generate a long text that may stop making sense halfway through or deviate from the provided context. I want the generation to feel a bit more natural. Could you please share an example of how StoppingCriteria would work? I couldn’t find a usage example in the docs.
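For reference, here is a rough sketch of what I imagine (I haven’t verified this is the intended usage — the class name, the stop-token IDs, and the model/tokenizer wiring are all my own assumptions):

```python
# A minimal sketch of a custom StoppingCriteria: stop generation as soon as
# the last generated token is a chosen sentence-ending token (e.g. ".").
# The actual token IDs depend on the tokenizer, so look them up first.
import torch
from transformers import StoppingCriteria, StoppingCriteriaList

class StopOnTokens(StoppingCriteria):
    """Return True (i.e. stop) once the most recent token is in stop_token_ids."""

    def __init__(self, stop_token_ids):
        self.stop_token_ids = set(stop_token_ids)

    def __call__(self, input_ids: torch.LongTensor, scores, **kwargs) -> bool:
        # input_ids has shape (batch, seq_len); inspect the newest token.
        return input_ids[0, -1].item() in self.stop_token_ids

# Hypothetical usage with generate() (model/tokenizer are placeholders):
# stop_ids = tokenizer.encode(".", add_special_tokens=False)
# criteria = StoppingCriteriaList([StopOnTokens(stop_ids)])
# output = model.generate(**inputs, stopping_criteria=criteria, max_new_tokens=200)
```

Is this roughly how it’s meant to be plugged into `model.generate`?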