Tensor size error in PEFT (Prefix Tuning)


I am trying Prefix Tuning using the PEFT library on a multilabel classification problem, and I am getting the following error:
“RuntimeError: The size of tensor a (512) must match the size of tensor b (492) at non-singleton dimension 1”

num_virtual_tokens is set to 20. The model used is "bert-base-uncased".

Any help in resolving this error?


Hi @shreyans92dhankhar

I am getting the same error when trying to fine-tune a TOKEN_CLS task type using prefix tuning with the PEFT library.

Did you solve the problem?

Thanks in advance

Hi @Luciano, I was able to solve this issue by truncating the tokenizer input at 512 - num_virtual_tokens (20 in my case), i.e. at 492 tokens instead of 512. With that change it worked.
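For reference, the length bookkeeping behind this fix can be sketched as follows (the values are the ones assumed in this thread, not anything the PEFT library computes for you):

```python
# Why the shapes mismatch: prefix tuning prepends num_virtual_tokens
# learned tokens to every input, so BERT's 512-position limit is shared
# between the real tokens and the virtual prefix.
model_max_length = 512    # bert-base-uncased position-embedding limit
num_virtual_tokens = 20   # from PrefixTuningConfig(num_virtual_tokens=20)

# Truncation length the tokenizer should use so that
# real tokens + virtual prefix still fit within 512 positions:
safe_max_length = model_max_length - num_virtual_tokens
print(safe_max_length)  # 492
```

Passing this value as model_max_length (or max_length with truncation=True) to the tokenizer is what avoids the 512-vs-492 mismatch.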


Thanks a lot @shreyans92dhankhar , this worked for me too!

# before: truncated to the model's full 512-token limit
tokenizer = AutoTokenizer.from_pretrained(model_checkpoint, use_fast=True, model_max_length=512)

# after: leave room for the 20 virtual tokens (512 - 20 = 492)
tokenizer = AutoTokenizer.from_pretrained(model_checkpoint, use_fast=True, model_max_length=492)


That’s great, glad the solution is helpful.

I was facing a similar problem and was still getting the same error even after changing the tokenizer truncation. However, enabling developer mode in Windows and re-downloading the model configuration from Hugging Face fixed it for me.