Finetuning GPT2 with user defined loss

super().from_pretrained('gpt2')

This line does not make much sense. If you want to inherit from GPT2LMHeadModel, just do:

from transformers import GPT2LMHeadModel

class GPT2FinetunedWithNgrams(GPT2LMHeadModel):
    def __init__(self, config):
        super().__init__(config)
        # your additional code here

and then:

model = GPT2FinetunedWithNgrams.from_pretrained("gpt2")

If you want to change the loss function, you will have to override the forward method of GPT2LMHeadModel.
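A minimal sketch of such an override, assuming your loss only needs the logits and the labels. The standard shifted cross-entropy is shown as a placeholder; swap it for your own objective where indicated (the exact signature and return format may differ between transformers versions):

import torch.nn as nn
from transformers import GPT2LMHeadModel

class GPT2FinetunedWithNgrams(GPT2LMHeadModel):
    def forward(self, input_ids=None, attention_mask=None, labels=None, **kwargs):
        # Run the base transformer and project the hidden states to
        # vocabulary logits, as GPT2LMHeadModel does internally.
        transformer_outputs = self.transformer(input_ids, attention_mask=attention_mask)
        hidden_states = transformer_outputs[0]
        lm_logits = self.lm_head(hidden_states)

        loss = None
        if labels is not None:
            # Shift so that each position predicts the next token.
            shift_logits = lm_logits[..., :-1, :].contiguous()
            shift_labels = labels[..., 1:].contiguous()
            # Standard cross-entropy shown as a placeholder; replace this
            # with your own (e.g. n-gram based) loss computation.
            loss_fct = nn.CrossEntropyLoss()
            loss = loss_fct(shift_logits.view(-1, shift_logits.size(-1)),
                            shift_labels.view(-1))

        return (loss, lm_logits) if loss is not None else (lm_logits,)

You can still load the pretrained weights with GPT2FinetunedWithNgrams.from_pretrained("gpt2") as above, since only the forward pass changes, not the architecture.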