404 when instantiating private model/tokenizer

I think I might be missing something obvious, but when I attempt to load my private model checkpoint with the Auto* classes and use_auth_token=True, I get a 404 response. I couldn’t find anything in the docs about the token/auth setup for the library, so I’m not sure what’s wrong.

from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("jodiak/mymodel", use_auth_token=True)

model = AutoModelWithLMHead.from_pretrained("jodiak/mymodel", use_auth_token=True)
# 404 Client Error: Not Found for url: https://huggingface.co/jodiak/model/resolve/main/config.json

I’ve verified that I can load this checkpoint from a local directory, and also that the path on the Model Hub is correct. Any help here is appreciated.

Hi @jodiak, did you go through the login step?

https://huggingface.co/transformers/model_sharing.html?highlight=login#basic-steps

That should create a ~/.huggingface/token file, which is used to authenticate access to your private model.
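To check whether the login actually produced a token, you can look for that file directly. Here is a minimal sketch, assuming the default token location described above (the `read_hf_token` helper is just for illustration, not part of the library):

```python
import os

# Default path where the CLI login stores the auth token
# (assumption: the ~/.huggingface/token location mentioned above).
TOKEN_PATH = os.path.expanduser("~/.huggingface/token")

def read_hf_token(path=TOKEN_PATH):
    """Return the saved Hugging Face token, or None if not logged in."""
    if os.path.isfile(path):
        with open(path) as f:
            return f.read().strip()
    return None

token = read_hf_token()
if token is None:
    print("No token file found -- run the login step first.")
```

If the token exists, you can also pass it explicitly, e.g. from_pretrained("jodiak/mymodel", use_auth_token=token), instead of relying on use_auth_token=True picking it up from disk.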