How to use a privately hosted model for a Space?

Would that be a problem here because the secret's location might not match what from_pretrained() is expecting?

Precisely! When you pass True to use_auth_token, it will look under ~/.huggingface for your token, which is generated when you log in with the CLI. That's the usual setup when you're working locally, but for a Space I think the best thing to do is to pass the token itself as a string to use_auth_token. You could even do something like

import os
from transformers import AutoModelForMaskedLM

# Use the Space secret if it's set, otherwise fall back to the local CLI token
auth_token = os.environ.get("TOKEN_FROM_SECRET") or True
model = AutoModelForMaskedLM.from_pretrained("my-org/model-name", use_auth_token=auth_token)

so that you can have it work both locally and on your Space easily.
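To see why the `or True` trick works, here's a tiny standalone sketch of the fallback (the helper name resolve_auth_token is just for illustration; TOKEN_FROM_SECRET matches the secret name used above):

```python
import os

def resolve_auth_token(env_var: str = "TOKEN_FROM_SECRET"):
    # On a Space, the secret is exposed as an environment variable,
    # so this returns the token string. Locally the variable is unset
    # (os.environ.get returns None), so `or True` yields True, which
    # tells from_pretrained() to use the token from the CLI login.
    return os.environ.get(env_var) or True

print(resolve_auth_token())  # True locally, the token string on a Space
```

The same code path then works in both environments without any branching.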