The model was working fine for a few weeks until yesterday. Calling the Inference API from a Docker container results in the same error as when using the inference widget on the model card. Error below:
Can't load tokenizer using from_pretrained, please update its configuration: xxxx/wav2vec_xxxxxxxx is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models' If this is a private repository, make sure to pass a token having permission to this repo with use_auth_token or log in with huggingface-cli login and pass use_auth_token=True.
We are passing the token in the API call; I have tried both an org token and a user token, to no avail. The code I am using is below:
import requests

API_URL = "https://api-inference.huggingface.co/models/xxx/wav2vec_xxxxxx"
headers = {"Authorization": "Bearer xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"}

def query(filename):
    with open(filename, "rb") as f:
        data = f.read()
    response = requests.post(API_URL, headers=headers, data=data)
    return response.json()
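As a side note for anyone debugging this: the Inference API reports failures as a JSON body of the form {"error": "..."} rather than an exception, so the tokenizer error comes back inside response.json(). A small helper to surface it (a sketch; the helper name is my own):

```python
def check_response(payload):
    # The Inference API reports failures as an {"error": "..."} JSON payload
    # (optionally with "estimated_time" while a model is still loading), so a
    # successful-looking json() result can still be an error. Raise to make
    # the message easy to spot.
    if isinstance(payload, dict) and "error" in payload:
        raise RuntimeError(f"Inference API error: {payload['error']}")
    return payload
```

Usage would be something like `output = check_response(query("audio.flac"))` (filename is a placeholder).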
Okay, it's magically working again. Though a member of our team did add an extra tokenizer.json file that was used by other models built on the same base model as ours. So maybe that helped.
Though I suspect it was a Hugging Face bug, because the number of downloads on the model card was 91 while the model was broken, and it is now down to 79, around the number of downloads from before it started working.
I'm still facing this issue and I too think it's a Hugging Face bug.
I'm trying to access private models through a Space, which was working until yesterday (and no changes were made). I have set my token, which has access to the model, in the Space secrets. I tried both write and read permissions, with no change. I've also tried hard-coding the token directly in the code, but had no luck; the traceback error is still the same:
OSError: / is not a local folder and is not a valid model identifier listed on 'Models - Hugging Face' If this is a private repository, make sure to pass a token having permission to this repo with use_auth_token or log in with huggingface-cli login and pass use_auth_token=True.
In the documentation I've also seen that the parameter to be used is 'token' and not 'use_auth_token'. When setting both to my auth token, the traceback suggests using only 'token'. I've tried that too, but had no luck.
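For what it's worth, newer transformers releases accept `token=` while older ones only accept `use_auth_token=`, and passing both at once triggers exactly that warning. A minimal sketch of picking one, reading the token from a Space secret (the env-var name `HF_TOKEN` and the helper are my assumptions, not anything from the docs):

```python
import os

# Token stored as a Space secret; "HF_TOKEN" is whatever name you gave the secret.
token = os.environ.get("HF_TOKEN")

def auth_kwargs(token, legacy=False):
    # Build the keyword argument for from_pretrained: pass exactly one of
    # `token` (newer transformers) or `use_auth_token` (older releases),
    # never both.
    if token is None:
        return {}
    return {"use_auth_token": token} if legacy else {"token": token}

# e.g. AutoTokenizer.from_pretrained("org/private-model", **auth_kwargs(token))
```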
If someone has any idea or is facing the same, please let me know!
Facing the same bug. Can't load tokenizer using from_pretrained, please update its configuration: MealMate/2M_Classifier is not a local folder and is not a valid model identifier listed on 'Models - Hugging Face' If this is a private repository, make sure to pass a token having permission to this repo with use_auth_token or log in with huggingface-cli login and pass use_auth_token=True.