Fine-tuning an encoder-decoder model for custom language translation

Hello everyone,
I would like to know whether it is possible to train a BERT2GPT model (or another encoder-decoder model) from scratch to translate a custom language. Specifically, I need to translate ASL gloss sequences into English.

I have already looked for tutorials online, but most of them cover text generation; I cannot find any for text-to-text translation.

I have read about the EncoderDecoder architecture and I think it should be possible; I just don't know how to set up a notebook that trains one from scratch using Hugging Face models.
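To make the question concrete, here is a minimal sketch of the kind of setup I mean, using `transformers`' `EncoderDecoderModel` with tiny, randomly initialized BERT and GPT-2 configs (no pretrained weights, so it trains from scratch). The vocabulary sizes, dimensions, and the random token ids standing in for gloss/English sequences are all placeholders:

```python
import torch
from transformers import (
    BertConfig,
    GPT2Config,
    EncoderDecoderConfig,
    EncoderDecoderModel,
)

# Tiny configs so this runs quickly; a real model would be much larger.
enc_cfg = BertConfig(
    vocab_size=1000, hidden_size=64, num_hidden_layers=2,
    num_attention_heads=2, intermediate_size=128,
)
dec_cfg = GPT2Config(vocab_size=1000, n_embd=64, n_layer=2, n_head=2)

# from_encoder_decoder_configs marks the decoder as a decoder and
# enables cross-attention to the encoder outputs.
cfg = EncoderDecoderConfig.from_encoder_decoder_configs(enc_cfg, dec_cfg)
model = EncoderDecoderModel(config=cfg)

# The model needs these to build decoder inputs from the labels.
model.config.decoder_start_token_id = 0
model.config.pad_token_id = 0

# Placeholder batches: gloss token ids in, English token ids as labels.
src = torch.randint(0, 1000, (2, 8))
tgt = torch.randint(0, 1000, (2, 6))

out = model(input_ids=src, labels=tgt)
loss = out.loss  # cross-entropy loss you would backpropagate in training
```

Is a setup along these lines (plus real tokenizers for the gloss and English sides, and a `Seq2SeqTrainer` loop) the right direction?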

Could you help me? Has anyone done something like this?