Error while accessing the "canopylabs/orpheus-3b-0.1-ft" model

I’m encountering an issue when trying to load a model using the Hugging Face transformers library. Below is the code snippet I’m working with:

from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("canopylabs/orpheus-3b-0.1-ft")
model = AutoModelForCausalLM.from_pretrained("canopylabs/orpheus-3b-0.1-ft")

After executing the snippet above, the following error appears:
Error: Exception: data did not match any variant of untagged enum ModelWrapper at line 1509159 column 3

Has anyone encountered this issue before? If so, could you help me understand what might be causing this error and how I can fix it? Any pointers or solutions would be greatly appreciated!

Thanks in advance for your help!


I think the Transformers library doesn’t support that model yet.

I thought the same, but I found the snippet above here under “Use this model”.


under “use this model”.

Since that snippet is automatically generated, it is safer to give priority to the model card (the description written by the author themselves). :sweat_smile:
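For what it’s worth, this particular message (“data did not match any variant of untagged enum ModelWrapper”) is raised by the Rust deserializer in the `tokenizers` package, and it typically means the installed `tokenizers`/`transformers` versions are too old to parse the format of the repo’s `tokenizer.json`. A minimal sketch to check what you have installed, assuming that version mismatch is the cause (the `installed_version` helper is just for illustration):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(pkg: str) -> str:
    """Return the installed version of `pkg`, or 'not installed'."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return "not installed"

# The parse error typically shows up when `tokenizers` is older than the
# tokenizer.json format used by the model repo.
for pkg in ("transformers", "tokenizers"):
    print(pkg, installed_version(pkg))
```

If these are old, upgrading with `pip install --upgrade transformers tokenizers` and retrying the `from_pretrained` call is often enough; otherwise the model card is the authoritative source for which versions (or which dedicated library) the model expects.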