I’m a beginner attempting a fresh install of Phi-3-mini-4k-instruct on Windows 10, and I’m having trouble loading the model. My Python script hangs at "Loading checkpoint shards: 0%|" and no error messages appear in the console. The system is a Windows 10 PC running Python 3.11.9. Is anyone else having trouble running Phi-3-mini-4k-instruct on Windows? Here’s the script, followed by a small snippet that prints my environment details.
#===================================================
# Script Name: envtest_phi3_mini_4k.py
# Purpose: a hello-world test of the Hugging Face dev environment
# Author: Tony B
#===================================================
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "microsoft/Phi-3-mini-4k-instruct"
print(f"Step 1: Script Launched using model name: {model_name}")
# Load the model and tokenizer
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)
print(f"Step 2: Model and Tokenizer loaded")
# Generate text from a prompt
prompt = "Explain the benefits of using machine learning in healthcare:"
inputs = tokenizer(prompt, return_tensors="pt", return_overflowing_tokens=False)
outputs = model.generate(inputs["input_ids"], max_length=100)
print(f"Step 3: Output loaded from PHI3")
# Decode and print the output
generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(f"Step 4: Generated Text: {generated_text}")