Correct way to use pre-trained models

I want to solve a multiclass, multi-label (MLMC) classification problem using the ConvBERT model.

The steps I have taken are:

I downloaded the ConvBERT model from the Hugging Face Hub: YituTech/conv-bert-base · Hugging Face

from pytorch_pretrained_bert import BertTokenizer, BertForSequenceClassification, BertAdam
tokenizer = BertTokenizer.from_pretrained("path_to_Conv-Bert_model", do_lower_case=True)
model = BertForSequenceClassification.from_pretrained("path_to_Conv-Bert_model", num_labels=240)

What I want to understand: can we call any classification module from Hugging Face and pass any pre-trained model to it (RoBERTa, ConvBERT, and so on), as in the example above?

The question is a bit vague, but if you mean the BERT-like family of models: yes, they can be loaded using code like the above, and similarly for other families. Note, however, that each architecture has its own dedicated classes; ConvBERT, for example, has its own (see here: ConvBERT — transformers 4.7.0 documentation).
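A minimal sketch of loading ConvBERT through its dedicated classes in the maintained `transformers` package (rather than the old `pytorch_pretrained_bert`). To keep the sketch runnable offline it builds the model from a config instead of downloading weights; the `from_pretrained` call shown in the comment is what you would use in practice:

```python
from transformers import ConvBertConfig, ConvBertForSequenceClassification

# In practice you would load the pre-trained weights, e.g.:
#   model = ConvBertForSequenceClassification.from_pretrained(
#       "YituTech/conv-bert-base",
#       num_labels=240,
#       problem_type="multi_label_classification",
#   )
# Here the model is built from a config only, so everything (including the
# classification head) is randomly initialized.
config = ConvBertConfig(num_labels=240, problem_type="multi_label_classification")
model = ConvBertForSequenceClassification(config)

print(type(model).__name__)    # ConvBertForSequenceClassification
print(model.config.num_labels) # 240
```

Setting `problem_type="multi_label_classification"` makes the model use a per-label binary cross-entropy loss instead of softmax cross-entropy, which is what a multi-label setup needs.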

The classification head will be initialized with random weights, so the model will need fine-tuning.
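To illustrate why the fine-tuning objective matters here: in a multi-label problem each of the 240 labels is an independent binary decision, so the head's logits go through a per-label sigmoid with binary cross-entropy, not a softmax. A hypothetical sketch with made-up tensors (not the author's code):

```python
import torch

num_labels = 240
logits = torch.randn(4, num_labels)                      # fake head outputs, batch of 4
targets = torch.randint(0, 2, (4, num_labels)).float()   # multi-hot label vectors

# BCEWithLogitsLoss applies a sigmoid per label internally; this is the loss
# transformers uses when problem_type="multi_label_classification".
loss = torch.nn.BCEWithLogitsLoss()(logits, targets)

# At inference time, threshold each label's probability independently.
preds = (torch.sigmoid(logits) > 0.5).long()
print(loss.item(), preds.shape)
```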