Fine-tuning T5-large for paraphrasing multiple times with the same parameters and data gives different results

I use this script to fine-tune T5-large for paraphrasing (on Google Colab).

I ran the fine-tuning twice from scratch, using the same data and the same parameters, and expected the results to be the same. However, the results are quite different.

Is there any way to set, for example, a seed, so that I get the same result when everything else is the same?

I am not sure whether this alone should make the results the same:

import random

import numpy as np
import torch

def set_seed(seed):
    # Seed the Python, NumPy, and PyTorch (CPU) RNGs
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)

set_seed(42)
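Seeding random, NumPy, and torch covers the CPU side, but on a GPU runtime there are extra sources of nondeterminism, notably cuDNN autotuning and some CUDA kernels. A fuller setup is sketched below; the helper name set_full_seed is just illustrative:

import os
import random

import numpy as np
import torch

def set_full_seed(seed):
    # Illustrative helper; the extra lines address GPU nondeterminism
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)           # seed the RNG on every GPU
    torch.backends.cudnn.deterministic = True  # pick deterministic cuDNN kernels
    torch.backends.cudnn.benchmark = False     # autotuning selects kernels nondeterministically
    os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"  # required by some deterministic CUDA ops

set_full_seed(42)

If you train with the transformers library, transformers.set_seed(42) wraps the Python/NumPy/PyTorch/CUDA seeding in one call, and the Trainer applies the TrainingArguments seed at the start of training. Even then, bit-for-bit reproducibility on GPU is not guaranteed for every operation.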

Hi @zokica, are you using Colab Pro for fine-tuning the t5-large model? In my case I am getting RAM issues.
Is there a way to fine-tune it on a free Colab instance?
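For what it's worth, the usual levers for fitting t5-large into limited GPU memory are a small batch size with gradient accumulation, gradient checkpointing, and mixed precision; Adafactor also keeps less optimizer state than Adam. A minimal sketch assuming the Hugging Face Trainer, where output_dir is a placeholder:

from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="out",                  # placeholder path
    per_device_train_batch_size=1,     # smallest per-step batch...
    gradient_accumulation_steps=8,     # ...accumulated to an effective batch of 8
    gradient_checkpointing=True,       # recompute activations instead of storing them
    fp16=True,                         # mixed precision roughly halves activation memory
)

Whether this is enough on a free instance depends on sequence length and the rest of your pipeline.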