How can I skip the GPT2LMHeadModel embedding layers?

@dharmendra If your model inherits from PreTrainedModel, it should work out of the box. If it doesn't, please open an issue in transformers 🙂
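
For reference, a minimal sketch of one common way to bypass the token embedding lookup: `GPT2LMHeadModel.forward` accepts an `inputs_embeds` argument, so you can compute (or replace) the embeddings yourself and feed them in directly. This is just an illustration of that API, not the only approach:

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

input_ids = tokenizer("Hello world", return_tensors="pt").input_ids

# Compute the embeddings yourself. Here we reuse the model's own
# token embedding table, but any tensor of shape
# [batch, seq_len, hidden_size] works.
inputs_embeds = model.get_input_embeddings()(input_ids)

# Passing inputs_embeds instead of input_ids skips the model's
# internal token embedding lookup.
outputs = model(inputs_embeds=inputs_embeds)
print(outputs.logits.shape)  # [1, seq_len, vocab_size]
```

Note that the positional embeddings are still added inside the model; only the token embedding lookup is skipped.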
