Hi there,
I hope one of you can help me solve my problem. Transformers version: 4.33.3.
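For reference, I checked the installed version inside the same environment the script runs in with a quick snippet (nothing special, just to rule out a mismatch between environments):

import transformers
print(transformers.__version__)  # prints 4.33.3 here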
I tried to download the new Mistral model by using the snippet posted on Hugging Face, but I got this error message and do not know how to fix it:
Exception has occurred: KeyError
'mistral'
  File "C:\Users\Stefan Trauth\Desktop\LeoX\Mistral\Mistral 7B.py", line 5, in
Snippet I is taken directly from Hugging Face, and Snippet II is one I wrote myself and normally use, but I get the same error message in both cases.
Snippet I:
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda"  # the device to load the model onto

model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1")
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1")

messages = [
    {"role": "user", "content": "What is your favourite condiment?"},
    {"role": "assistant", "content": "Well, I'm quite partial to a good squeeze of fresh lemon juice. It adds just the right amount of zesty flavour to whatever I'm cooking up in the kitchen!"},
    {"role": "user", "content": "Do you have mayonnaise recipes?"}
]

encodeds = tokenizer.apply_chat_template(messages, return_tensors="pt")

model_inputs = encodeds.to(device)
model.to(device)

generated_ids = model.generate(model_inputs, max_new_tokens=1000, do_sample=True)
decoded = tokenizer.batch_decode(generated_ids)
print(decoded[0])
Snippet II:
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Initialize the tokenizer and model
tokenizer = AutoTokenizer.from_pretrained('mistralai/Mistral-7B-v0.1')
model = AutoModelForCausalLM.from_pretrained('mistralai/Mistral-7B-v0.1')

while True:
    # Get user input
    user_input = input('You: ')

    # Encode the input and add end of string token
    input_ids = tokenizer.encode(user_input, return_tensors='pt')

    # Generate a response from the model
    with torch.no_grad():
        output = model.generate(input_ids, max_length=50)

    # Decode the output and print the answer
    answer = tokenizer.decode(output[:, input_ids.shape[-1]:][0], skip_special_tokens=True)
    print(f'Mistral-7B: {answer}')

    # Check if the user wants to continue
    cont = input('Do you want to continue? (yes/no): ')
    if cont.lower() == 'no':
        break
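If I am reading the traceback correctly, the error is raised by the model-loading call in line 5, so I suspect even just loading the config would trigger it. This is only my own paring-down of the scripts above, not something from the Hugging Face docs:

from transformers import AutoConfig

# I expect this to raise the same KeyError: 'mistral' on transformers 4.33.3
config = AutoConfig.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1")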
Thanks a lot.