How to output loss from model.generate()?

I’m using transformers 4.19.2 and torch 1.11.0+cu113, and it still runs without error.
My minimal code is below.

from types import MethodType

from transformers import BartForConditionalGeneration, BartTokenizer
from undecorated import undecorated

tokenizer = BartTokenizer.from_pretrained("facebook/bart-large-cnn")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large-cnn")

# Strip the @torch.no_grad() decorator from generate() so gradients can flow
generate_with_grad = undecorated(model.generate)
model.generate_with_grad = MethodType(generate_with_grad, model)

# Example input, just so the snippet runs end to end
input_ids = tokenizer("Some text to summarize.", return_tensors="pt").input_ids

output = model.generate_with_grad(
    input_ids=input_ids,
    output_scores=True,
    return_dict_in_generate=True,
    output_hidden_states=True,
)
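
For the original question of getting a loss out of generate(), here is a minimal sketch (not part of the original post) of one way to turn the returned scores into a differentiable scalar loss. It assumes batch-size-1 greedy generation and the usual alignment where output.sequences carries one extra leading decoder start token compared to output.scores:

import torch
import torch.nn.functional as F

# output.scores is a tuple with one (batch, vocab_size) logit tensor per generated step;
# output.sequences has one extra leading token (the decoder start token).
logits = torch.stack(output.scores, dim=1)          # (batch, steps, vocab_size)
generated_tokens = output.sequences[:, 1:]          # align generated tokens with scores

log_probs = F.log_softmax(logits, dim=-1)
token_log_probs = log_probs.gather(-1, generated_tokens.unsqueeze(-1)).squeeze(-1)

# backward() only works on scalars, so reduce over the generated tokens
loss = -token_log_probs.mean()
loss.backward()

With beam search the scores are reordered across beams, so this simple alignment only holds for greedy or sampled generation.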

According to the error message, there is an in-place operation somewhere, so it would be a good idea to work backwards: comment out each calculation step, line by line from the end to the beginning, to find which step causes the problem. Note that backward() is only possible on scalars, so for multidimensional tensors it is better to sum or average as appropriate.
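
As a generic illustration of both points (not specific to the code above), PyTorch's anomaly detection can report which forward-pass operation caused the in-place error, and reducing a tensor to a scalar before backward() looks like this:

import torch

# Ask autograd to report the forward-pass operation responsible for a backward failure,
# which helps locate the offending in-place operation.
torch.autograd.set_detect_anomaly(True)

x = torch.randn(4, 5, requires_grad=True)
y = torch.exp(x)      # exp's backward needs its own output...
# y += 1.0            # ...so an in-place edit like this would raise the in-place error

loss = y.mean()       # backward() works on scalars, so sum or average first
loss.backward()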