Unexpected error with Falcon 7B: running locally fails with an odd matrix dimension mismatch. How do I fix it?

I was running the Falcon 7B tutorial locally on my RTX A6000 but got an error with an odd matrix multiplication shape mismatch:

  File "/lfs/hyperturing1/0/brando9/miniconda/envs/data_quality/lib/python3.10/site-packages/peft/tuners/lora.py", line 565, in forward
    result = F.linear(x, transpose(self.weight, self.fan_in_fan_out), bias=self.bias)
RuntimeError: mat1 and mat2 shapes cannot be multiplied (2048x4544 and 1x10614784)

I think it’s caused by LoRA. I’m really not running anything fancy; it’s literally copy-pasted from the tutorial.
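
One thing I noticed while staring at the shapes: 1 × 10614784 is exactly (4672 × 4544) / 2, and Falcon-7B’s fused query_key_value weight has shape (4672, 4544). That looks like a 4-bit bitsandbytes weight packed two values per byte, which an older peft would pass straight into F.linear as if it were a plain nn.Linear. For reference, here is a minimal sketch of the setup I believe the tutorial uses; the model ID, hyperparameters, and the version requirement are my assumptions, not a verified fix:

```python
# Sketch of the (assumed) tutorial setup: Falcon-7B loaded in 4-bit with a LoRA
# adapter on the fused attention projection. Assumes peft >= 0.4.0, so that LoRA
# wraps bitsandbytes Linear4bit layers instead of calling F.linear on the packed
# quantized weight:
#   pip install -U transformers peft bitsandbytes accelerate
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_id = "tiiuae/falcon-7b"  # assumed; substitute the checkpoint from the tutorial

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
    trust_remote_code=True,  # Falcon shipped custom modeling code
)

# Casts norms/embeddings to fp32 and enables gradient checkpointing for k-bit training.
model = prepare_model_for_kbit_training(model)

lora_config = LoraConfig(
    r=16,                                # illustrative hyperparameters
    lora_alpha=32,
    target_modules=["query_key_value"],  # Falcon's fused QKV projection
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```

If the shape arithmetic above is right, upgrading peft (along with bitsandbytes and transformers) would be the first thing to try.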

Does anyone know how to fix this?

What tutorial are you following? Are you trying to fine-tune? Does it work if you try another method besides LoRA?

If you’re just trying to run inference with Falcon, you may want to look at an API service: Chain Conductor - Home
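
If you do want inference locally, a plain text-generation pipeline avoids the LoRA code path entirely. A minimal sketch, assuming transformers plus accelerate and enough GPU memory for bf16 weights; the model ID and prompt are illustrative:

```python
# Minimal local-inference sketch for Falcon-7B (no LoRA/peft involved).
import torch
from transformers import AutoTokenizer, pipeline

model_id = "tiiuae/falcon-7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)

pipe = pipeline(
    "text-generation",
    model=model_id,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,  # Falcon shipped custom modeling code
    device_map="auto",
)

out = pipe("Falcon is a large language model that", max_new_tokens=50, do_sample=True)
print(out[0]["generated_text"])
```

If that runs cleanly, the problem is isolated to the LoRA/quantization setup rather than the model or your GPU.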