Fine-tune T5 for text generation in a specific domain

1. I want to continue training a T5 model from Hugging Face on my own corpus (about a specific domain).
2. Then I want to fine-tune this model for text generation.

I am worried that there might be a conflict between the two steps. So, is this possible to do?

I’m not sure what you mean. Training a model on a corpus is fine-tuning the model. Make sure you choose a model well suited to the task you want it to perform; for text generation you should start with a causal language model (filter by text generation on the models page). Then I suggest making sure your corpus is well formatted so you don’t get anything you don’t want in your generations. The hardest part is managing the server resources for training.
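
For the first step (continued, domain-adaptive training of T5), here is a minimal sketch with the Trainer API. T5 was originally pretrained with a span-corruption objective (the Flax example script `run_t5_mlm_flax.py` in the Transformers repo implements it); as a simpler stand-in, this sketch just splits each passage in half and trains the model to generate the second half from the first. The file name `domain_corpus.txt` and the hyperparameters are placeholders, not anything from this thread.

```python
# Step 1 (sketch): continue training T5 on a raw domain corpus.
# Assumes a plain-text file "domain_corpus.txt" with one passage per line.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model_name = "t5-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

raw = load_dataset("text", data_files={"train": "domain_corpus.txt"})
raw = raw.filter(lambda x: len(x["text"].split()) > 1)  # drop empty/one-word lines

def split_and_tokenize(batch):
    # Simplified denoising-style objective: first half of the passage is the
    # input, second half is the target (not T5's original span corruption).
    inputs, targets = [], []
    for line in batch["text"]:
        words = line.split()
        mid = max(1, len(words) // 2)
        inputs.append(" ".join(words[:mid]))
        targets.append(" ".join(words[mid:]))
    model_inputs = tokenizer(inputs, max_length=256, truncation=True)
    labels = tokenizer(text_target=targets, max_length=256, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

train_ds = raw["train"].map(split_and_tokenize, batched=True, remove_columns=["text"])

args = Seq2SeqTrainingArguments(
    output_dir="t5-domain-adapted",
    per_device_train_batch_size=8,
    num_train_epochs=1,
    learning_rate=1e-4,
    logging_steps=100,
    save_strategy="epoch",
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
trainer.save_model("t5-domain-adapted")
```

The saved checkpoint in `t5-domain-adapted` can then be loaded as the starting point for the second fine-tuning step, so there is no conflict between the two stages.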

Yes, it is possible, and it is pretty easy now with the help of the HF team.

Here are some resources, and I suggest searching for more on YouTube or the HF blogs.
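
For the second step, here is a minimal sketch of fine-tuning the checkpoint saved in step 1 on a supervised text-generation dataset and then generating from it. The file `generation_data.jsonl` with `input`/`target` fields, the checkpoint path, and the hyperparameters are all hypothetical placeholders; adapt them to your own data.

```python
# Step 2 (sketch): fine-tune the domain-adapted T5 checkpoint for generation.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

checkpoint = "t5-domain-adapted"  # output of step 1
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# JSON lines file with {"input": ..., "target": ...} records (placeholder name).
data = load_dataset("json", data_files={"train": "generation_data.jsonl"})

def preprocess(batch):
    model_inputs = tokenizer(batch["input"], max_length=256, truncation=True)
    labels = tokenizer(text_target=batch["target"], max_length=256, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

train_ds = data["train"].map(
    preprocess, batched=True, remove_columns=data["train"].column_names
)

args = Seq2SeqTrainingArguments(
    output_dir="t5-domain-generation",
    per_device_train_batch_size=8,
    num_train_epochs=3,
    learning_rate=3e-4,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()

# Quick generation check with the fine-tuned model.
prompt = "example domain-specific prompt"
ids = tokenizer(prompt, return_tensors="pt").input_ids.to(model.device)
out = model.generate(ids, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```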