Hi, I am trying to visualize what `from_pretrained` does.
from transformers import BertModel
finbert_bertmodel = BertModel.from_pretrained('ProsusAI/finbert')
# BertModel is the bare BERT transformer outputting raw hidden-states, without any specific head on top,
# while the FinBERT checkpoint was saved from a BertForSequenceClassification architecture.
This is my understanding of `from_pretrained` for this piece of code: it loads all of FinBERT's weights whose names match parameters in the BertModel architecture, and the sequence-classification head weights are discarded, since BertModel has no head to put them in.
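To make this concrete for myself, I sketched the name-based matching with plain Python dicts. The parameter names below are simplified placeholders, not the exact keys in the FinBERT checkpoint; this is just to illustrate the idea that weights are matched by name and unmatched ones are dropped.

```python
# Toy sketch of how from_pretrained matches checkpoint weights to an
# architecture by parameter name (key names simplified for illustration).

# Pretend FinBERT checkpoint: backbone weights plus a classification head.
checkpoint = {
    "bert.embeddings.word_embeddings.weight": "finbert",
    "bert.encoder.layer.0.attention.self.query.weight": "finbert",
    "classifier.weight": "finbert",   # sequence-classification head
    "classifier.bias": "finbert",
}

# Parameter names that a bare BertModel expects (no head).
bertmodel_keys = {
    "embeddings.word_embeddings.weight",
    "encoder.layer.0.attention.self.query.weight",
}

# The loader strips the "bert." prefix and keeps only matching keys;
# everything else in the checkpoint is ignored.
loaded = {k.removeprefix("bert."): v for k, v in checkpoint.items()
          if k.removeprefix("bert.") in bertmodel_keys}
unused = sorted(k for k in checkpoint
                if k.removeprefix("bert.") not in bertmodel_keys)

print(sorted(loaded))  # both backbone weights come from FinBERT
print(unused)          # ['classifier.bias', 'classifier.weight'] are dropped
```

In the real library, `from_pretrained` also logs a warning listing such unused checkpoint weights when it discards them.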
Similarly, for a different model class, say:

from transformers import BertForMaskedLM
finbert_mlm = BertForMaskedLM.from_pretrained('ProsusAI/finbert')
FinBERT's backbone weights will be loaded into the BertForMaskedLM architecture, and the weights of the MLM head (the last layer of BertForMaskedLM) will be randomly initialised, because the FinBERT checkpoint is a sequence-classification model, not a masked-LM one.
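The same matching idea, sketched with sets for the MaskedLM case (again, the parameter names are simplified, not the actual checkpoint keys):

```python
# Toy sketch of loading a seq-classification checkpoint into an MLM
# architecture: missing keys get random init, unmatched keys are ignored.

checkpoint_keys = {
    "bert.embeddings.word_embeddings.weight",    # backbone, present in FinBERT
    "bert.encoder.layer.0.output.dense.weight",  # backbone, present in FinBERT
    "classifier.weight",                         # seq-classification head
}
mlm_keys = {
    "bert.embeddings.word_embeddings.weight",
    "bert.encoder.layer.0.output.dense.weight",
    "cls.predictions.decoder.weight",            # MLM head
}

loaded  = checkpoint_keys & mlm_keys   # copied from the checkpoint
missing = mlm_keys - checkpoint_keys   # randomly initialised by the loader
unused  = checkpoint_keys - mlm_keys   # ignored

print(sorted(missing))  # ['cls.predictions.decoder.weight'] -> random init
print(sorted(unused))   # ['classifier.weight'] -> dropped
```

So only the shared backbone is transferred; the MLM head starts from scratch and would need fine-tuning before the model is useful for masked-token prediction.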
Is my understanding correct?
Please add any additional details, and any links with more information.