I subscribed to Hugging Face for a private model repository.
I uploaded my model, but how can I use it?
I can't use it, because when I pass the model name (e.g., kykim/bert-kor-large) for fine-tuning, the command line says it is not listed on huggingface.co/models.
@kykim Thanks for your question/feedback, this is helpful!
Could you paste the command you're running here?
To instantiate a private model from transformers, you need to add a use_auth_token=True parameter (this should be mentioned when you click the “Use in transformers” button on the model page):
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("username/model_name", use_auth_token=True)
model = AutoModelForMaskedLM.from_pretrained("username/model_name", use_auth_token=True)
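Note that use_auth_token=True relies on a token being cached locally, which you get by running huggingface-cli login once on that machine. If you prefer, you can pass the token string directly instead of True; here is a minimal sketch, assuming the token is exported in an environment variable named HF_TOKEN (that variable name is just for illustration):

import os
from transformers import AutoTokenizer

# Pass the token string explicitly instead of relying on a cached login
# (HF_TOKEN is an illustrative environment variable name).
tokenizer = AutoTokenizer.from_pretrained(
    "username/model_name", use_auth_token=os.environ["HF_TOKEN"]
)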
If you’re using a fine-tuning script, for now you will have to modify it to add this parameter yourself (to all the from_pretrained() calls).
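For example, the model-loading section of a script would end up looking roughly like this (a sketch only; the exact variable names and whether a config object is passed depend on the script you're using):

from transformers import AutoConfig, AutoTokenizer, AutoModelForMaskedLM

# Each from_pretrained() call in the script gets the extra use_auth_token=True argument.
config = AutoConfig.from_pretrained("username/model_name", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("username/model_name", use_auth_token=True)
model = AutoModelForMaskedLM.from_pretrained(
    "username/model_name", config=config, use_auth_token=True
)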
We’ll make sure to update the code in the repo in the coming weeks too (cc @sgugger)