Can I use a BPE tokenizer with a T5 model that is not pretrained?

Hi! I'm a beginner in NLP. I want to use a T5 model that is not pretrained (initialized from a config rather than from pretrained weights):

my_config = T5Config.from_pretrained("/content/config.json")
model = T5ForConditionalGeneration(my_config)

and use my own BPE tokenizer:
fast_tokenizer = PreTrainedTokenizerFast(tokenizer_file="/content/tokenizer.json")
fast_tokenizer.add_special_tokens({'pad_token': '[PAD]'})
fast_tokenizer.padding_side = "left"

But when I train and then call model.generate, it only returns tensor([0, 1]), and I have only trained for 1 epoch.
Am I doing anything wrong, or should I just train for more epochs?