Unable to Finetune DeBERTa

I am trying to finetune DeBERTa for an irony-detection task; my Colab notebook can be found here.

When I try to use the `microsoft/deberta-v3-base` checkpoint with `AutoModel`, I get the following error:

RuntimeError: Expected target size [32, 2], got [32]
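I can reproduce the same error with dummy tensors, which I assume mirrors what happens in my training loop (the loss function is my guess; batch size 32, sequence length 30, 2 classes as in my run):

```python
import torch
import torch.nn as nn

# Logits come out as [batch, seq_len, num_labels] instead of [batch, num_labels]
logits = torch.randn(32, 30, 2)       # shape I get from the DeBERTa head
labels = torch.randint(0, 2, (32,))   # one label per example

try:
    nn.CrossEntropyLoss()(logits, labels)
except RuntimeError as e:
    print(e)  # same "Expected target size" RuntimeError as above
```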

But when I use the same setup with `bert-base-uncased` or RoBERTa (with some changes to the head), it works fine. Working code for the BERT-based version can be found in this notebook.

When I print the shapes of the predictions and labels, I get torch.Size([32, 30, 2]) and torch.Size([32]) respectively. With BERT, the shapes are torch.Size([32, 2]) and torch.Size([32]) for predictions and labels.

Here 32 is the batch size, and 30 is the sequence length.
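To make the difference concrete with dummy tensors (hidden size 768 is my assumption for the base model): applying a linear head to the full per-token hidden states gives the [32, 30, 2] shape I'm seeing, while pooling to a single vector per example (e.g. the first token) gives the [32, 2] shape that BERT produces:

```python
import torch
import torch.nn as nn

batch, seq_len, hidden = 32, 30, 768  # sizes from my run; hidden size assumed
head = nn.Linear(hidden, 2)           # 2 classes: ironic / not ironic

# Stand-in for the model's last_hidden_state output
last_hidden_state = torch.randn(batch, seq_len, hidden)

per_token = head(last_hidden_state)      # [32, 30, 2] -- what I'm seeing
pooled = head(last_hidden_state[:, 0])   # [32, 2] -- one prediction per example

print(per_token.shape, pooled.shape)
```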

Can someone let me know what I’m doing wrong?