Exporting models

Hello,
Is there a way to download or export a model and then save it, so that I can use models offline?

Thank you

Hi @Katarina, sure, there's a method called `AutoModelForXXX.save_pretrained` that you can use: Models — transformers 4.3.0 documentation

You can then load the model using the `AutoModelForXXX.from_pretrained` method :slight_smile:
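As a minimal sketch of that round trip (the model name is the one from this thread; the local path is just an example, adjust it to your setup):

```python
from transformers import AutoModelForSeq2SeqLM

# Download the model once while online, then write its weights
# and config to a local directory
model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-en-sla")
model.save_pretrained("./saved_model")  # creates config.json plus the weights file

# Later (e.g. offline), load it back from the local directory
model = AutoModelForSeq2SeqLM.from_pretrained("./saved_model")
```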


Thank you, yes, I have used this. I saved the model into my library and got two files, pytorch_model.bin and config.json, but when I try to load it I get the error:
```
OSError: Can't load config for '/user-home/libraries/KK/scripts/config.json'. Make sure that:

- '/user-home/libraries/KK/scripts/config.json' is a correct model identifier listed on 'https://huggingface.co/models'

- or '/user-home/libraries/KK/scripts/config.json' is the correct path to a directory containing a config.json file
```

Could you tell me if I am doing something wrong?
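One thing worth noting about the error above: `from_pretrained` expects the path to the *directory* containing config.json, not the path to the config.json file itself. A small (hypothetical) helper illustrates the distinction:

```python
import os

def resolve_model_dir(path):
    """If given the config.json file itself, return its parent directory,
    which is what from_pretrained actually expects."""
    if os.path.basename(path) == "config.json":
        return os.path.dirname(path)
    return path

# The path from the error message points at the file, not the directory
resolve_model_dir("/user-home/libraries/KK/scripts/config.json")
# A directory path is passed through unchanged
resolve_model_dir("/user-home/libraries/KK/scripts")
```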

Can you please share the code snippet that you are using to save and load your model? You can use three backticks "```" to write code in Markdown :slight_smile:

First I saved a model:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-en-sla")
model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-en-sla")

model.save_pretrained("../packages/python/models/my_model")
```

I checked, and the folder was created with the files pytorch_model.bin and config.json.
Then I called it again, passing the path to that folder:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("../packages/python/models/my_model")
model = AutoModelForSeq2SeqLM.from_pretrained("../packages/python/models/my_model")
```

But I get the error:

```
OSError: Can't load tokenizer for '../packages/python/models/my_model'. Make sure that:

- '../packages/python/models/my_model' is a correct model identifier listed on 'https://huggingface.co/models'

- or '../packages/python/models/my_model' is the correct path to a directory containing relevant tokenizer files
```

Ah, you also need to save the tokenizer with

```python
tokenizer.save_pretrained("../packages/python/models/my_model")
```
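Putting the whole thread together, a complete save-and-reload script would look roughly like this (the save directory is an example path; both the tokenizer and the model are written to, and loaded from, the same folder):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

save_dir = "./my_model"  # example path; use any writable directory

# Download once while online, then persist both pieces to the same folder
tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-en-sla")
model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-en-sla")
tokenizer.save_pretrained(save_dir)
model.save_pretrained(save_dir)

# Later, fully offline, load both back from that folder
tokenizer = AutoTokenizer.from_pretrained(save_dir)
model = AutoModelForSeq2SeqLM.from_pretrained(save_dir)
```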

Does that solve your problem?


Yes, thank you :)
