How do I use a CausalLM model to pre-train and a SequenceClassification model to fine-tune?

How can I use AutoModelForCausalLM to pre-train on my own dataset, and then use AutoModelForSequenceClassification to fine-tune?

Starting from a public pretrained checkpoint:

# Load BERT with a freshly initialized classification head (15 labels)
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("bert-base-cased", num_labels=15)

# ... fine-tune on your labeled data here, then save the result
model.save_pretrained("./bert_pretrained")
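For the fine-tuning step itself, something along these lines should work with the Trainer API. This is only a minimal sketch: the CSV files, the "text"/"label" column names, and the hyperparameters are placeholders for your own data.

from transformers import AutoTokenizer, Trainer, TrainingArguments
from datasets import load_dataset

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

# Placeholder labeled dataset with "text" and "label" columns
dataset = load_dataset("csv", data_files={"train": "train.csv", "validation": "dev.csv"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="./bert_finetuned", num_train_epochs=3, per_device_train_batch_size=16),
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
)
trainer.train()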


Starting from your own pretrained checkpoint:

# First, pre-train a causal LM starting from the public checkpoint.
# is_decoder=True is required so that BERT can be used as a standalone
# causal (left-to-right) language model.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("bert-base-cased", is_decoder=True)

# ... run your causal-LM pre-training on your own dataset here, then save
model.save_pretrained("./bert_pretrained2")
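The actual pre-training runs between loading the model and calling save_pretrained above. A minimal sketch with the Trainer, assuming your own corpus is a plain-text file (corpus.txt is a placeholder path): DataCollatorForLanguageModeling with mlm=False builds the shifted labels needed for causal language modeling.

from transformers import AutoTokenizer, DataCollatorForLanguageModeling, Trainer, TrainingArguments
from datasets import load_dataset

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

# Placeholder corpus: one training example per line
raw = load_dataset("text", data_files={"train": "corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = raw.map(tokenize, batched=True, remove_columns=["text"])

# mlm=False -> causal (next-token) language modeling labels instead of masked LM
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="./bert_pretrained2", num_train_epochs=1, per_device_train_batch_size=8),
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()

# Save the tokenizer next to the model so both reload from the same directory
tokenizer.save_pretrained("./bert_pretrained2")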


# Then fine-tune for sequence classification from your own pretrained checkpoint.
# Note: the saved config carries is_decoder=True; you may want to pass
# is_decoder=False here so the classifier uses full bidirectional attention.
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("./bert_pretrained2", num_labels=15)

# ... fine-tune on your labeled data here, then save
model.save_pretrained("./bert_pretrained3")
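If you saved the tokenizer into "./bert_pretrained2" as well (as assumed above), it reloads from the same directory, and a quick forward pass confirms the classifier outputs one logit per label. Fine-tuning then proceeds exactly as in the first snippet, just with this model and tokenizer.

import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./bert_pretrained2")

inputs = tokenizer("An example sentence.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.shape)  # expected: torch.Size([1, 15])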