First of all, I should clarify that I'm pretty new to transformer models and Hugging Face, so while I've done my best to learn, I still consider myself a beginner. On to the main topic of my question: I want to fine-tune a model for the task of going from natural language to MongoDB queries (a similar task is usually done with SQL). Pretrained transformer models are usually used for such tasks, so I want to use T5. I've already built a dataset for what I want to implement, but since I'm new to all of this, there are a few doubts I hope you can clear up. I'm still learning, so apologies if I mix things up (and thanks in advance):
- Is this considered fine-tuning or transfer learning? (I still can't tell when each term applies.)
- In a way, this task is basically machine translation, so I'd like to think fine-tuning would be the way to go.
- For the SQL version of this task, models are usually pretrained on a wide range of datasets, including WikiSQL. In my case, I don't think they were pretrained on anything MongoDB-related, and I was wondering whether that would affect my model's accuracy. If so, would that mean I have to continue pretraining the model before fine-tuning it?
- I'd also love it if you could point me to good resources on this.
Note: I've already started fine-tuning T5 on this task, so hopefully the answers will be reassuring.
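For context, here is roughly how I'm turning each (question, query) pair in my dataset into T5's text-to-text format. The "translate English to MongoDB:" prefix is my own invention, loosely modeled on T5's built-in translation prefixes, so treat it as a sketch rather than anything official:

```python
def format_example(question: str, mongo_query: str) -> dict:
    """Turn one dataset row into T5's text-to-text format.

    The task prefix below is a hypothetical choice of mine, not an
    official T5 prefix; T5 just needs some consistent input/target text.
    """
    return {
        "input_text": "translate English to MongoDB: " + question,
        "target_text": mongo_query,
    }

# Example row from my dataset (illustrative values):
pair = format_example(
    "find all users older than 30",
    'db.users.find({"age": {"$gt": 30}})',
)
print(pair["input_text"])
# -> translate English to MongoDB: find all users older than 30
print(pair["target_text"])
# -> db.users.find({"age": {"$gt": 30}})
```

I then tokenize `input_text` and `target_text` with the T5 tokenizer and feed them to the model as inputs and labels.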
Have a nice day.