Finetuning using transformers

Hi, I am trying to fine-tune a sequence-to-sequence model.

Here's the code snippet I use for calculating the loss and iterating over the responses, but for some reason the outputs are blank. I am not sure whether the problem is in this code or somewhere else.

epochs = 10
model.train()  # set training mode once, before the loop (it was previously called after the forward pass)
for epoch in range(epochs):
  for i in range(dataset_size):
    temp = tokenized_input_text[i]
    outputs = model(**temp.to("cuda:0"))   # forward pass on GPU
    logits = outputs.logits[0]             # (seq_len, vocab_size)
    labels = tokenized_output_text[i][0]   # (seq_len,)
    loss = lossfct(logits, labels.to("cuda:0"))
    loss.backward()                        # accumulate gradients
    optimizer.step()                       # update weights
    optimizer.zero_grad(set_to_none=True)  # clear gradients for the next step

I suspect something is wrong in the last few lines, probably because I don't know much about how to use them.
Any help would be appreciated.
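
For context, this is my understanding of how the loss is supposed to be computed for seq2seq training: the logits get flattened to `(batch * seq_len, vocab_size)` and compared against flattened labels with `CrossEntropyLoss`, where padding positions are set to `-100` so they are ignored. The shapes and values below are made up purely for illustration, not from my actual data:

```python
import torch
import torch.nn as nn

# toy shapes for illustration: batch=1, seq_len=4, vocab_size=10
logits = torch.randn(1, 4, 10)

# -100 is the index CrossEntropyLoss ignores by default,
# which is what transformers uses to mask out padding positions
labels = torch.tensor([[2, 5, 1, -100]])

lossfct = nn.CrossEntropyLoss()
# flatten to (batch * seq_len, vocab_size) vs (batch * seq_len,)
loss = lossfct(logits.view(-1, logits.size(-1)), labels.view(-1))
print(loss.item())
```

(From the docs, I believe seq2seq models in transformers can also compute this internally if you pass `labels=` to the forward call and read `outputs.loss`, but I wanted to understand the manual version first.)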