Finetuning using transformers

Hi, I am trying to fine-tune a sequence-to-sequence model.

Here's my code snippet for calculating the loss while iterating over the responses, but for some reason the outputs are blank. I am not sure if there is something wrong in the code or something else.

epochs = 10
for epoch in range(epochs):
  for i in range(dataset_size):
    temp = tokenized_input_text[i]
    outputs = model(**temp.to("cuda:0"))
    logits = outputs.logits[0]
    labels = tokenized_output_text[i][0]
    loss = lossfct(logits, labels.to("cuda:0"))

I suspect something is wrong in the last four lines, probably because I don't know much about how to use them.
Any help would be appreciated.
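For reference, here is a toy training loop I put together showing the pieces I think my snippet is missing (a backward pass and an optimizer step). The tiny embedding-plus-linear model, the fake data, and all the names here are just stand-ins for my actual seq2seq setup, not real code from my project. I believe that with a transformers seq2seq model you can also pass labels= into the forward call and read outputs.loss directly, but I am not sure that is the best way.

```python
import torch
import torch.nn as nn

device = "cuda:0" if torch.cuda.is_available() else "cpu"
vocab_size = 32

# Tiny stand-in for the seq2seq model: embedding + linear head over the vocab.
model = nn.Sequential(nn.Embedding(vocab_size, 16), nn.Linear(16, vocab_size)).to(device)
lossfct = nn.CrossEntropyLoss()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

# Fake "tokenized" data: (input_ids, label_ids) pairs, 8 tokens each.
dataset = [(torch.randint(0, vocab_size, (8,)), torch.randint(0, vocab_size, (8,)))
           for _ in range(4)]

epochs = 10
model.train()
for epoch in range(epochs):
    for input_ids, label_ids in dataset:
        input_ids = input_ids.to(device)
        label_ids = label_ids.to(device)
        logits = model(input_ids)          # shape: (seq_len, vocab_size)
        loss = lossfct(logits, label_ids)  # cross-entropy over token positions
        optimizer.zero_grad()
        loss.backward()   # without backward() and step(), the weights never update
        optimizer.step()
```

Is the missing backward()/step() part what makes my real loop produce blank outputs, or is the problem elsewhere?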