Resources required to fine-tune a large model?


I am a student who just started learning about NLP.
I recently found the model below on Hugging Face :slight_smile: and was blown away by its performance.

So I have been playing with it and want to fine-tune it to change its tone and feed in more data so it gains fluency in certain subjects. The problem is that I have never dealt with such a large model, and training crashes due to a shortage of GPU memory. I have no idea how many resources are needed for a model this size, so I am not sure whether I should invest in a new GPU or get a cloud VM subscription.
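From what I've read, there is a rough rule of thumb for full fine-tuning with Adam: weights plus gradients plus the two optimizer states, not counting activations. I'm not sure I have the byte counts right (I assumed fp16 weights/gradients and fp32 Adam states), so please correct me if this estimate is off:

```python
def finetune_memory_gb(num_params_billion,
                       bytes_weights=2,   # fp16 weights (assumption)
                       bytes_grads=2,     # fp16 gradients (assumption)
                       bytes_optim=8):    # fp32 Adam momentum + variance (assumption)
    """Rough GPU memory estimate (GiB) for full fine-tuning with Adam.

    Activation memory depends on batch size and sequence length and
    is NOT included, so the real requirement is higher than this.
    """
    n = num_params_billion * 1e9
    total_bytes = n * (bytes_weights + bytes_grads + bytes_optim)
    return total_bytes / 1024**3

# e.g. a 7B-parameter model: 7e9 params * 12 bytes each
print(round(finetune_memory_gb(7), 1))  # roughly 78 GiB before activations
```

If that math is anywhere near correct, even a single high-end consumer GPU would not be enough for full fine-tuning, which is why I'm wondering what setups other people actually used.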

If any of you have tried fine-tuning it, could you share how you did it and how many resources it took? Thanks for your time!