Hello,
Is there possibility to download or export model and then save it, so that I can use models offline?
Thank you
Hi @Katarina, sure, there's a method called AutoModelForXXX.save_pretrained
that you can use: Models → transformers 4.3.0 documentation
You can then load the model using the AutoModelForXXX.from_pretrained
method
Thank you… yes, I have used this. I saved the model into my library and got two files: pytorch_model.bin and config.json, but when I try to load it I get the error:
OSError: Can't load config for '/user-home/libraries/KK/scripts/config.json'. Make sure that:
- '/user-home/libraries/KK/scripts/config.json' is a correct model identifier listed on 'https://huggingface.co/models'
- or '/user-home/libraries/KK/scripts/config.json' is the correct path to a directory containing a config.json file
Do you maybe know if I am doing something wrong…
Can you please share the code snippet that you are using to save and load your model? You can use three backticks (```) to write code in Markdown.
First I saved the model:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-en-sla")
model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-en-sla")
model.save_pretrained("../packages/python/models/my_model")
```
And I checked: the folder was created with the files pytorch_model.bin and config.json.
Then I tried to load it again by passing the path to that folder:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("../packages/python/models/my_model")
model = AutoModelForSeq2SeqLM.from_pretrained("../packages/python/models/my_model")
```
But I get the error:

```
OSError: Can't load tokenizer for '../packages/python/models/my_model'. Make sure that:
- '../packages/python/models/my_model' is a correct model identifier listed on 'https://huggingface.co/models'
- or '../packages/python/models/my_model' is the correct path to a directory containing relevant tokenizer files
```
Ah, you also need to save the tokenizer with:

```python
tokenizer.save_pretrained("../packages/python/models/my_model")
```
Does that solve your problem?
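For future readers, the full round trip looks roughly like this (a minimal sketch assembled from the snippets above; the model name and save directory are the ones used in this thread, and the first two calls still need an internet connection to download the model once):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

save_dir = "../packages/python/models/my_model"

# While online: download the model and tokenizer once,
# then save BOTH into the same local directory.
tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-en-sla")
model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-en-sla")
model.save_pretrained(save_dir)       # writes config.json + model weights
tokenizer.save_pretrained(save_dir)   # writes the tokenizer files

# Later, offline: load both from that directory instead of the Hub name.
tokenizer = AutoTokenizer.from_pretrained(save_dir)
model = AutoModelForSeq2SeqLM.from_pretrained(save_dir)
```

The key point is that a directory only works as an argument to from_pretrained if it contains both the model files and the tokenizer files, which is why saving only the model produced the "Can't load tokenizer" error above.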
yes, thank you:)