Hi All,
I cannot get my GKE cluster to download meta-llama/Meta-Llama-3.3-70B-Instruct, even though the model page says:
"
meta-llama/Llama-3.3-70B-Instruct"
You have been granted access to this model
"
with:

cat download_model.py
"
import os
from huggingface_hub import hf_hub_download

token = os.getenv("HF_TOKEN")
print(token)
hf_hub_download(repo_id="meta-llama/Meta-Llama-3.3-70B-Instruct", filename="config.json", token=token)
"
The HF_TOKEN is a read token, and I have regenerated it many times (it cannot be posted here).
What could the cause be?
Thanks a lot