I’m encountering an issue when trying to load a model using the Hugging Face transformers
library. Below is the code snippet I’m working with:
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("canopylabs/orpheus-3b-0.1-ft")
model = AutoModelForCausalLM.from_pretrained("canopylabs/orpheus-3b-0.1-ft")
After executing the above code snippet, the following error is raised:
Error: Exception: data did not match any variant of untagged enum ModelWrapper at line 1509159 column 3
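In case it's relevant, I also checked the versions of transformers and tokenizers installed in my environment with the small snippet below (I'm guessing the message comes from the tokenizer deserializer, but I'm not certain):
import transformers
import tokenizers
# Print the installed library versions, in case the error is version-related
print("transformers:", transformers.__version__)
print("tokenizers:", tokenizers.__version__)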
Has anyone encountered this issue before? If so, could you help me understand what might be causing this error and how I can fix it? Any pointers or solutions would be greatly appreciated!
Thanks in advance for your help!