Load Custom Model

I tried looking into this similar issue: Custom GPT2 Model won't load after training

but still…

import torch
from transformers import PreTrainedModel, PretrainedConfig, DistilBertModel

class CustomClass(PreTrainedModel):
    def __init__(self, config, num_labels):
        super().__init__(config, num_labels)
        self.distilbert = DistilBertModel.from_pretrained('distilbert-base-uncased')
        self.pre_classifier = torch.nn.Linear(768, 768)
        self.dropout = torch.nn.Dropout(0.1)
        self.classifier = torch.nn.Linear(768, num_labels)

    def forward(self, input_ids, attention_mask):
        distilbert_output = self.distilbert(input_ids=input_ids, attention_mask=attention_mask)
        hidden_state = distilbert_output[0]
        pooled_output = hidden_state[:, 0]
        pooled_output = self.pre_classifier(pooled_output)
        pooled_output = torch.nn.Tanh()(pooled_output) 
        pooled_output = self.dropout(pooled_output)  
        output = self.classifier(pooled_output)  
        return output

I was able to fine-tune it with a linear classifier for a classification task.

config = PretrainedConfig(name_or_path='own-model', num_labels=100, output_hidden_states=True)

model = CustomClass(config, 100)

I can also save it with

model.save_pretrained(PATH)

But when I try to load it with

new_model=PreTrainedModel.from_pretrained('./PATH/')

I got 'NoneType' object has no attribute 'from_pretrained', which is really strange.

Alternatively, with a config file, this doesn't throw an error, though the model weights won't load:

new_config = PretrainedConfig.from_pretrained('./PATH/')
new_model = PreTrainedModel.from_pretrained('./PATH/', config=new_config)

Please help, I am running out of ideas. Oh, and I am on transformers v4.6.0.

Thank you!

You should use CustomClass.from_pretrained. PreTrainedModel.from_pretrained won’t work directly.
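For what it's worth (this is my reading of the from_pretrained behaviour, so treat it as an assumption): the base PreTrainedModel class has config_class = None, and from_pretrained calls cls.config_class.from_pretrained(...) when you don't pass a config, which would explain the 'NoneType' error above. On the subclass the call is simply:

# './PATH/' is the directory written by model.save_pretrained(PATH)
new_model = CustomClass.from_pretrained('./PATH/')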

CustomClass.from_pretrained didn't work for me; I'm still getting the same error. Not sure why from_pretrained isn't working as an inherited class method.

But this “seems” to work. It's strange that it didn't need from_pretrained, just instantiating the model. I read somewhere that loading with only a config won't load the weights, so I suspect it is still NOT done:

cfg = PretrainedConfig.from_pretrained('./PATH/')
model = CustomClass(cfg, num_labels=100)
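If you only instantiate the class like that, the fine-tuned weights are indeed not restored: __init__ re-downloads the base distilbert weights and the classifier head is freshly initialized. As a rough workaround you can load the saved state dict yourself; this sketch assumes save_pretrained wrote a pytorch_model.bin file into ./PATH/:

import torch

cfg = PretrainedConfig.from_pretrained('./PATH/')
model = CustomClass(cfg, num_labels=100)

# manually restore the weights written by save_pretrained()
state_dict = torch.load('./PATH/pytorch_model.bin', map_location='cpu')
model.load_state_dict(state_dict)
model.eval()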

Oh, you need to add a class variable to your custom model, specifically:

config_class = CustomConfig

and have your config be an instance of CustomConfig (it might work with config_class = PretrainedConfig, but I'm not 100% sure).
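Something along these lines, roughly following the custom-model docs (CustomConfig and the model_type string are just illustrative names):

from transformers import PretrainedConfig

class CustomConfig(PretrainedConfig):
    model_type = 'custom-distilbert'   # illustrative identifier

    def __init__(self, num_labels=100, **kwargs):
        super().__init__(num_labels=num_labels, **kwargs)

Then set config_class = CustomConfig in your model class (the attribute has to be spelled exactly config_class) so that from_pretrained knows which config class to instantiate.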

class CustomClass(PreTrainedModel):

    class_config = PretrainedConfig   # class attribute like this here?

    def __init__(self, config, num_labels):
        super().__init__(config, num_labels)
        self.distilbert = DistilBertModel.from_pretrained('distilbert-base-uncased')
        self.pre_classifier = torch.nn.Linear(768, 768)
        self.dropout = torch.nn.Dropout(0.1)
        self.classifier = torch.nn.Linear(768, num_labels)

    def forward(self, input_ids, attention_mask):
        distilbert_output = self.distilbert(input_ids=input_ids, attention_mask=attention_mask)
        hidden_state = distilbert_output[0]
        pooled_output = hidden_state[:, 0]
        pooled_output = self.pre_classifier(pooled_output)
        pooled_output = torch.nn.Tanh()(pooled_output) 
        pooled_output = self.dropout(pooled_output)  
        output = self.classifier(pooled_output)  
        return output

I added it like above, but still no luck loading it with

new_config=PretrainedConfig.from_pretrained('./PATH/') 

new_model=CustomClass.from_pretrained('./PATH/', config=new_config)

Is there any notebook I can look at for reference? This is a very common pattern, I suppose.

Looks like this is the way to load it with from_pretrained:

new_config=PretrainedConfig.from_pretrained('./PATH/') 
new_model=CustomClass.from_pretrained(pretrained_model_name_or_path='./PATH/', config=new_config, num_labels=100)

I have to verify the weights are also loaded successfully.
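One way to check (assuming save_pretrained wrote pytorch_model.bin into ./PATH/ and the head sits under the classifier.* keys) is to compare a reloaded parameter against the checkpoint on disk:

import torch

saved = torch.load('./PATH/pytorch_model.bin', map_location='cpu')
reloaded = new_model.state_dict()

# if loading worked, the classifier head should match the checkpoint exactly
print(torch.equal(saved['classifier.weight'], reloaded['classifier.weight']))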

Doing the above, it is still not picking up the weights properly. Can you share any reference with examples of inheriting from PreTrainedModel to create a custom model, and how to reload it with from_pretrained?

@daboliang did you fix it? I am running into the same problem.

@LearnToGrow are you still struggling? It's working for me. I can save my custom model with my custom head and am able to load it afterwards with the correct weights.

The important thing is to use keyword arguments. So you need to write:
new_model=CustomClass.from_pretrained(pretrained_model_name_or_path='./PATH/', config=new_config)
instead of only:
new_model=CustomClass.from_pretrained('./PATH/', new_config)

And I use a custom config, which I assume is necessary for all custom models: https://huggingface.co/docs/transformers/custom_models
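Here is a minimal end-to-end sketch of the setup that works for me, roughly following that doc page (the class names, the 100 labels and './PATH/' are illustrative; the key points are config_class, reading num_labels from the config so from_pretrained can rebuild the model without extra positional arguments, and loading with keyword arguments):

import torch
from transformers import PretrainedConfig, PreTrainedModel, DistilBertModel

class CustomConfig(PretrainedConfig):
    model_type = 'custom-distilbert'          # illustrative identifier

    def __init__(self, num_labels=100, **kwargs):
        super().__init__(num_labels=num_labels, **kwargs)

class CustomClass(PreTrainedModel):
    config_class = CustomConfig               # lets from_pretrained rebuild the right config

    def __init__(self, config):
        super().__init__(config)
        self.distilbert = DistilBertModel.from_pretrained('distilbert-base-uncased')
        self.pre_classifier = torch.nn.Linear(768, 768)
        self.dropout = torch.nn.Dropout(0.1)
        self.classifier = torch.nn.Linear(768, config.num_labels)

    def forward(self, input_ids, attention_mask):
        hidden_state = self.distilbert(input_ids=input_ids, attention_mask=attention_mask)[0]
        pooled = torch.tanh(self.pre_classifier(hidden_state[:, 0]))
        return self.classifier(self.dropout(pooled))

config = CustomConfig(num_labels=100)
model = CustomClass(config)
# ... train ...
model.save_pretrained('./PATH/')

new_config = CustomConfig.from_pretrained('./PATH/')
new_model = CustomClass.from_pretrained(pretrained_model_name_or_path='./PATH/', config=new_config)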

Hope that helps!