I was running the Falcon 7B tutorial locally on my RTX A6000 but hit an error with an odd matrix-multiplication shape mismatch:
File "/lfs/hyperturing1/0/brando9/miniconda/envs/data_quality/lib/python3.10/site-packages/peft/tuners/lora.py", line 565, in forward
result = F.linear(x, transpose(self.weight, self.fan_in_fan_out), bias=self.bias)
RuntimeError: mat1 and mat2 shapes cannot be multiplied (2048x4544 and 1x10614784)
I think it's caused by LoRA. I'm really not running anything fancy; it's literally copy-pasted from the tutorial, roughly the setup sketched below.
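For context, here is a minimal sketch of the kind of Falcon 7B + LoRA setup the tutorial walks through. The checkpoint name, LoRA hyperparameters, and target_modules=["query_key_value"] are the usual values for Falcon and are assumptions on my part, not a verbatim copy of the tutorial:

```python
# Minimal sketch, assuming the standard transformers + peft LoRA recipe for Falcon 7B.
# Checkpoint name, hyperparameters, and target_modules are illustrative, not the
# tutorial's exact values.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "tiiuae/falcon-7b"  # assumed base checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    trust_remote_code=True,
    device_map="auto",
)

lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["query_key_value"],  # Falcon's fused attention projection
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# A forward pass through the PEFT-wrapped model is where the
# F.linear shape-mismatch error above shows up.
batch = tokenizer("hello world", return_tensors="pt").to(model.device)
out = model(**batch)
```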
Does anyone know how to fix this?
Related issue:
cross:
- hf: Unexpected error with falcon 7B, running locally doesn’t work for an odd matrix mismatch dimension error, how to fix?
- dis: Discord
- so: Stack Overflow: Unexpected error with falcon 7B, running locally doesn’t work for an odd matrix mismatch dimension error, how to fix?
- reddit: Reddit