Finetuning conditional language model generation

Hi,
I need to find an example where one can use conditional generation like [1], but fine-tune it on some data, similar to recent work on prompt tuning for seq2seq models. Could you point me to the closest codebase for doing this with GPT-2/3? I looked into the summarization and translation folders in the PyTorch examples, but for some reason they don't support GPT-2. Thanks for your help.

[1] https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-generation/run_generation.py
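To make it concrete, here is a rough sketch of the kind of fine-tuning I have in mind: concatenate each (condition, target) pair and train GPT-2 with the usual causal LM loss, then generate conditioned on a prompt as in run_generation.py. The dataset, separators, and hyperparameters below are made up, just to illustrate what I'm after.

```python
# Rough sketch only: fine-tune GPT-2 on toy (condition, target) pairs
# by concatenating them and training with the standard causal LM loss.
import torch
from torch.utils.data import DataLoader, TensorDataset
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Toy (condition, target) pairs -- in practice this would be my own data.
pairs = [
    ("Review: great movie", "Sentiment: positive"),
    ("Review: terrible plot", "Sentiment: negative"),
]
texts = [f"{src} {tokenizer.eos_token} {tgt}{tokenizer.eos_token}" for src, tgt in pairs]

enc = tokenizer(texts, padding=True, truncation=True, max_length=128, return_tensors="pt")
loader = DataLoader(TensorDataset(enc["input_ids"], enc["attention_mask"]),
                    batch_size=2, shuffle=True)

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
for epoch in range(3):
    for input_ids, attention_mask in loader:
        # Labels are the inputs themselves; padding positions are ignored by the loss.
        labels = input_ids.clone()
        labels[attention_mask == 0] = -100
        out = model(input_ids=input_ids, attention_mask=attention_mask, labels=labels)
        out.loss.backward()
        optimizer.step()
        optimizer.zero_grad()

# After fine-tuning, condition on a prompt and generate, as in run_generation.py.
prompt = "Review: what a fun ride " + tokenizer.eos_token
ids = tokenizer(prompt, return_tensors="pt").input_ids
print(tokenizer.decode(model.generate(ids, max_new_tokens=20)[0]))
```

What I'm missing is an existing example script in the repo that does this kind of conditional fine-tuning for GPT-2 out of the box, the way the summarization/translation examples do for seq2seq models.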
