I pushed my first model to the Hub using the push_to_hub function around an hour ago. I can see it on the Hub website at https://huggingface.co/zyl1024/bert-base-cased-finetuned-qqp. However, when I try to use it with the transformers library, following the instructions on that page,
from transformers import AutoTokenizer, AutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("zyl1024/bert-base-cased-finetuned-qqp")
model = AutoModelForSequenceClassification.from_pretrained("zyl1024/bert-base-cased-finetuned-qqp")
I get an error saying
OSError: Can't load tokenizer for 'zyl1024/bert-base-cased-finetuned-qqp'. Make sure that:
- 'zyl1024/bert-base-cased-finetuned-qqp' is a correct model identifier listed on 'https://huggingface.co/models'
- or 'zyl1024/bert-base-cased-finetuned-qqp' is the correct path to a directory containing relevant tokenizer files
Do I need to do something else, or should I just wait a while (if so, how long?) for it to become available for download?
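For what it's worth, I only called push_to_hub on the model itself, not on the tokenizer. If the tokenizer files also need to be uploaded, I'm guessing something like this sketch would be the shape of it (untested; the helper name is mine, and I'm assuming the base checkpoint is bert-base-cased and that you're already logged in via huggingface-cli login):

```python
def push_model_and_tokenizer(repo_name: str) -> None:
    """Sketch: push both the model and the tokenizer to the same Hub repo.

    Hypothetical helper, not from my actual script. Assumes `transformers`
    is installed and you are authenticated with the Hub.
    `repo_name` would be e.g. "bert-base-cased-finetuned-qqp".
    """
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    # Assumption: the fine-tuning started from the bert-base-cased checkpoint.
    model = AutoModelForSequenceClassification.from_pretrained("bert-base-cased")
    tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

    # Push the model weights/config, then the tokenizer files
    # (tokenizer_config.json, vocab.txt, etc.) to the same repo.
    model.push_to_hub(repo_name)
    tokenizer.push_to_hub(repo_name)
```

Is a separate push_to_hub call on the tokenizer like this actually required, or should the model push have included those files?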