Fine-tuned GPT-2 model generates from the very beginning, not from the summary

I created a text summarization model by fine-tuning GPT-2 on a custom dataset.

Each training example was created by concatenating the input and target like this: 'Document:{document}\nSummary:{summary}'
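
For reference, each example is built roughly like this (a simplified sketch, not the exact training code; the tokenizer setup and max length here are assumptions):

```python
# Simplified sketch of how each training example is constructed.
from transformers import GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default

def build_example(document, summary):
    # Concatenate input and target into a single causal-LM training string.
    text = f"Document:{document}\nSummary:{summary}{tokenizer.eos_token}"
    return tokenizer(text, truncation=True, max_length=1024)
```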

The problem is that the model starts generating from the Document part instead of continuing after Summary. Is there any way to handle this?

Or is it simply not possible?
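
This is roughly how I run generation (again a simplified sketch; the model path is a placeholder and `document` is one test input):

```python
# Simplified sketch of the generation call on the fine-tuned model.
from transformers import GPT2LMHeadModel

model = GPT2LMHeadModel.from_pretrained("path/to/finetuned-gpt2")  # placeholder path

prompt = f"Document:{document}\nSummary:"
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_new_tokens=128,
    pad_token_id=tokenizer.eos_token_id,
)
# Decode only the tokens generated after the prompt.
generated = tokenizer.decode(
    output_ids[0, inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(generated)  # this comes out looking like the document, not a summary
```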