Hello,
I am new to this forum and to Hugging Face models. Could someone help me with this:
I want to use the model "Helsinki-NLP/opus-mt-en-sla". I am using the code from the site:
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-en-sla")
but I get this error: ValueError: Connection error, and we cannot find the requested files in the cached path. Please try again or make sure your Internet connection is on.
I have installed all the necessary libraries and my internet connection is good… could someone help with this?
Thanks
Hello @Katarina, I was able to run your code snippet without any problems on my machine, so I wonder whether there is some firewall / proxy you need to configure on your end?
A simple test that your connection is fine would be to spin up a Google Colab notebook and see if your code works there. Alternatively, you could try upgrading to the latest version of transformers just to be sure it's not an old bug that got fixed recently.
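A quick way to check the installed version from Python (the upgrade itself would be done with pip outside the interpreter):

import transformers

# Print the installed version to see whether an upgrade might be needed
print(transformers.__version__)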
Hi Lewis, thank you for the answer. I checked on Google Colab and it works fine there.
I am trying from Watson Studio, so there is probably something blocking the connection on my side. Do you maybe know how to incorporate that part, defining a proxy, into this code?
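A minimal sketch of what that could look like, assuming an HTTP proxy; the proxy address below is a placeholder and would need to be replaced with the one used in your environment:

import os
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder proxy address - replace with the proxy used by your environment
proxies = {"http": "http://proxy.example.com:8080", "https": "http://proxy.example.com:8080"}

# from_pretrained accepts a proxies dict that is forwarded to the underlying HTTP client
model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-en-sla", proxies=proxies)
tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-en-sla", proxies=proxies)

Another option is to set the HTTP_PROXY / HTTPS_PROXY environment variables before importing transformers.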
I have a related question. I have an environment without internet access. I downloaded the model inside a Docker container and moved the Docker image into the no-internet environment.
It's still giving me the connection error even though I pointed to the cache_dir. Is there any way to stop the code from requiring an internet connection and just use the cache_dir?
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-mul-en"
# Download the model and the tokenizer into a local cache directory
cache_dir = "/root/.cache/huggingface/transformers"
model = MarianMTModel.from_pretrained(model_name, cache_dir=cache_dir)
tokenizer = MarianTokenizer.from_pretrained(model_name, cache_dir=cache_dir)
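One approach that may help in a no-internet setup, assuming the files are already present in the cache and the installed transformers version supports the flag, is local_files_only, which skips the remote check:

from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-mul-en"
cache_dir = "/root/.cache/huggingface/transformers"

# local_files_only=True loads from the local cache without contacting the Hub
model = MarianMTModel.from_pretrained(model_name, cache_dir=cache_dir, local_files_only=True)
tokenizer = MarianTokenizer.from_pretrained(model_name, cache_dir=cache_dir, local_files_only=True)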
@lewtun What you suggested seems to work, although it has to be set before from transformers import MarianMTModel, MarianTokenizer if I want to use the os package to set the environment variables.
Other than the fact that it doesn't follow PEP 8, the models seem to be loading okay.
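For context, a minimal sketch of that ordering, assuming the suggestion being referred to is the TRANSFORMERS_OFFLINE environment variable (an assumption, since the referenced reply is not quoted here); the variable is read when transformers is imported, which is why the import has to come after it:

import os

# Assumption: the suggested variable is TRANSFORMERS_OFFLINE; it is read at import
# time, so it must be set before transformers is imported (hence the PEP 8 complaint
# about imports not being at the top of the file)
os.environ["TRANSFORMERS_OFFLINE"] = "1"

from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-mul-en"
cache_dir = "/root/.cache/huggingface/transformers"
model = MarianMTModel.from_pretrained(model_name, cache_dir=cache_dir)
tokenizer = MarianTokenizer.from_pretrained(model_name, cache_dir=cache_dir)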