Push model to Hugging Face Hub without Trainer

Hi, I want to train a model for text classification using bert-base-uncased with pure PyTorch (without Trainer). I wanted to push the fine-tuned model to the Hugging Face Hub, so I used this code:

model.push_to_hub("repository_name")
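For context, my setup looks roughly like this (a simplified sketch; the repository name, number of labels, and training details are placeholders and the training loop itself is omitted):

from torch.optim import AdamW
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Load the base model and tokenizer
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

optimizer = AdamW(model.parameters(), lr=2e-5)
model.train()

# ... plain PyTorch training loop over my DataLoader goes here ...

# After training, push the fine-tuned model to the Hub
model.push_to_hub("repository_name")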

But when I try to get a prediction for the example sentence in the model repository (via the Inference API), I get this error:
Can't load tokenizer using from_pretrained, please update its configuration: Can't load tokenizer for 'Zahra99/pure-python2'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'Zahra99/pure-python2' is the correct path to a directory containing all relevant files for a BertTokenizerFast tokenizer.
This is shown in the following image:

I searched for it, but I couldn’t find a solution to my problem: how can I push a model trained with pure PyTorch to the Hugging Face Hub and load it later?

I appreciate any help :pray:

What is in 'Zahra99/pure-python2'?

This error comes from here. Maybe this issue can help.

My repository includes these files:

I can’t open this link.

But I tried the comments in the other link you sent, manually uploaded tokenizer.json and tokenizer_config.json from the bert-base-uncased repository into my repository, and now the Inference API in my repository is working. Thank you so much!
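(In case it helps anyone else: the same manual upload can also be scripted with huggingface_hub instead of using the web UI. This is just a sketch, assuming the tokenizer files were first saved locally with tokenizer.save_pretrained("local_dir") and that the target repo is mine:)

from huggingface_hub import upload_file

# Upload the locally saved tokenizer files into the existing model repo
for filename in ["tokenizer.json", "tokenizer_config.json", "special_tokens_map.json", "vocab.txt"]:
    upload_file(
        path_or_fileobj=f"local_dir/{filename}",
        path_in_repo=filename,
        repo_id="Zahra99/pure-python2",
    )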

But I have a question: is this workaround really correct? And is there a way to upload the tokenizer files automatically?

I think only the Trainer uploads them automatically.
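For comparison, this is roughly what the Trainer route looks like (a sketch, not your exact setup; the tiny dataset, repo id, and labels are just placeholders). Passing the tokenizer to the Trainer is what makes its files get uploaded along with the model:

from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# Tiny placeholder dataset, just to keep the sketch self-contained
train_dataset = Dataset.from_dict({"text": ["good movie", "bad movie"], "label": [1, 0]})
train_dataset = train_dataset.map(lambda batch: tokenizer(batch["text"], truncation=True), batched=True)

training_args = TrainingArguments(
    output_dir="pure-python2",
    push_to_hub=True,                     # create/update the Hub repo
    hub_model_id="Zahra99/pure-python2",  # target repo id
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    tokenizer=tokenizer,  # because of this, Trainer also uploads the tokenizer files
)

trainer.train()
trainer.push_to_hub()  # pushes the model, config, and tokenizer together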

I can’t open this link.

Oops, this link points to a private repo :), and it’s not that useful, to be honest.

Thank you so much :blush:
This piece of code solved the problem:

tokenizer.push_to_hub("repository_name") # instead of adding the tokenizer files manually to the repository
model.push_to_hub("repository_name")
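And loading everything back later works with the usual from_pretrained calls once both the model and the tokenizer are in the repo (the repo id below is just a placeholder):

from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("username/repository_name")
model = AutoModelForSequenceClassification.from_pretrained("username/repository_name")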

Hi, even though I use the above, the error still persists for me. Any suggestions?