Hugging Face Transformers code executes successfully on Amazon Web Services but not on another server

This might be slightly off-topic, but I decided to write a question here in case anything helpful comes out of it.

I have a block of code that makes use of Hugging Face Transformers models.
I can execute this code on Amazon Web Services, so I don’t think there are any syntax or semantic errors in it.
However, when I run the same code on my university server, I keep getting the following error:

Traceback (most recent call last):
  File "/home/h56cho/projects/def-schonlau/h56cho/", line 505, in <module>
    main_function('/home/h56cho/projects/def-schonlau/h56cho/G1G2.txt','/home/h56cho/projects/def-schonlau/h56cho/G1G2_answer_num.txt', num_iter)
  File "/home/h56cho/projects/def-schonlau/h56cho/", line 439, in main_function
    gpt2_tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
  File "/localscratch/h56cho.42131937.0/env/lib/python3.8/site-packages/transformers/", line 1623, in from_pretrained
    resolved_vocab_files[file_id] = cached_path(
  File "/localscratch/h56cho.42131937.0/env/lib/python3.8/site-packages/transformers/", line 948, in cached_path
    output_path = get_from_cache(
  File "/localscratch/h56cho.42131937.0/env/lib/python3.8/site-packages/transformers/", line 1124, in get_from_cache
    raise ValueError(
ValueError: Connection error, and we cannot find the requested files in the cached path. Please try again or make sure your Internet connection is on.

I highly doubt that the error is due to the internet connection, so it may have to do with the “cached path”. Can anyone suggest how to solve this issue, or explain why this error is popping up?

Thank you,

It seems likely that the problem is with your internet connection, though. Perhaps your server has a strict policy about which files can be downloaded and which ones can’t? Make sure AWS is accessible.

It might also be a problem with access to the cache directory, but I would have expected an OSError or PermissionError in that case. You can verify that you have read/write access to ~/.cache/torch/transformers (<V4) or ~/.cache/huggingface/transformers (V4).
