Cost to fine tune large transformer models on the cloud?

hi folks

curious if anyone has experience fine-tuning a model like RoBERTa or BERT-large for text classification (sentiment analysis) on a dataset of ~1000 sentences?

similarly, any idea how much it would cost to first further pretrain the language model on ~1GB of uncompressed text?

thank you,

mick

Didn’t use RoBERTa, did use BERT. Fine-tuning BERT on a dataset that size can be done on Google Colab in decent time, i.e. it’s essentially free.
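To see why Colab is enough, here’s a back-of-envelope estimate of the fine-tuning workload. Every number below is an assumption (batch size, epoch count, and the guessed steps/second for a BERT-large on a free Colab GPU), not a measurement — benchmark your own runtime before trusting it:

```python
# Rough fine-tuning time estimate for ~1000 sentences.
# All constants are assumptions, not measured values.
sentences = 1000
epochs = 3                 # typical for classification fine-tuning
batch_size = 16
steps_per_second = 3       # assumed throughput; measure on your own GPU

steps = (sentences // batch_size + 1) * epochs
minutes = steps / steps_per_second / 60
print(f"~{steps} optimizer steps, ~{minutes:.1f} minutes per run")
```

Even if the assumed throughput is off by 10x, a full run is still well within a single free Colab session.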

Pretraining I cannot say in advance. 1 GB of text data is a lot. Try 10 MB for a few epochs first to make a rough estimate. Results are also not guaranteed to improve.
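For scale, here’s a very rough cost sketch for the 1 GB pretraining question. Every constant is an assumption I’m pulling out of the air (tokens-per-GB, MLM throughput on a single GPU, epoch count, hourly GPU price) — the 10 MB trial run above is exactly how you’d replace these guesses with real numbers:

```python
# Very rough continued-pretraining cost estimate.
# All constants below are assumptions, not benchmarks.
gb_of_text = 1
tokens = gb_of_text * 250_000_000   # assumed ~4 bytes per token for English
epochs = 3
tokens_per_second = 30_000          # assumed MLM throughput; measure first
dollars_per_gpu_hour = 2.0          # assumed on-demand cloud GPU price

gpu_hours = tokens * epochs / tokens_per_second / 3600
cost = gpu_hours * dollars_per_gpu_hour
print(f"~{gpu_hours:.0f} GPU-hours, ~${cost:.0f}")
```

Under these guesses it lands in the tens of dollars, not thousands — but throughput varies a lot with model size and sequence length, so treat the trial run as the real source of truth.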