The output of T5 is not consistent across multiple sequences

I wanted to edit my question, but that doesn't seem to be possible, so I'm adding these lines here:

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the t5-small tokenizer and model
tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

# Resize the embedding matrix to match the tokenizer's vocabulary size
model.resize_token_embeddings(len(tokenizer))

# Move the model to the GPU and put it in evaluation mode
model.to("cuda")
model.eval()
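
For context, this is roughly how I compare the outputs for a single sequence versus a padded batch, continuing from the setup above. The example sentences, the max_new_tokens value, and the greedy generation call (do_sample=False) are just placeholders for illustration, not the exact inputs or settings from my pipeline.

import torch

# Placeholder sentences, not my real inputs
sentences = [
    "translate English to German: The house is wonderful.",
    "translate English to German: The weather is nice today.",
]

with torch.no_grad():
    # Generate for each sentence on its own
    single_outputs = []
    for s in sentences:
        inputs = tokenizer(s, return_tensors="pt").to("cuda")
        out = model.generate(**inputs, max_new_tokens=50, do_sample=False)
        single_outputs.append(tokenizer.decode(out[0], skip_special_tokens=True))

    # Generate for the same sentences as one padded batch
    batch_inputs = tokenizer(sentences, return_tensors="pt", padding=True).to("cuda")
    batch_out = model.generate(**batch_inputs, max_new_tokens=50, do_sample=False)
    batch_outputs = tokenizer.batch_decode(batch_out, skip_special_tokens=True)

print(single_outputs)
print(batch_outputs)

With greedy decoding and model.eval(), I would expect the two lists to match, which is why the differences I'm seeing surprise me.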