Hello!
I tried updating our Gradio implementation in our HF Space to better handle the increased traffic we've had over the last few days. The build succeeds, but we get a connection error when loading the tokenizer (although I guess that part is incidental and the connection error might be more general?). Here is the full traceback:
Traceback (most recent call last):
  File "app.py", line 19, in <module>
    tokenizer = AutoTokenizer.from_pretrained('gpt2')
  File "/home/user/.local/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py", line 464, in from_pretrained
    tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
  File "/home/user/.local/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py", line 330, in get_tokenizer_config
    resolved_config_file = cached_path(
  File "/home/user/.local/lib/python3.8/site-packages/transformers/file_utils.py", line 1491, in cached_path
    output_path = get_from_cache(
  File "/home/user/.local/lib/python3.8/site-packages/transformers/file_utils.py", line 1715, in get_from_cache
    raise ValueError(
ValueError: Connection error, and we cannot find the requested files in the cached path. Please try again or make sure your Internet connection is on.
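
In case it helps, here is a stripped-down sketch of what the Space is doing (simplified from our actual app.py; the Gradio interface part is just a placeholder for the real app, only the tokenizer line matches the traceback above):

```python
import gradio as gr
from transformers import AutoTokenizer

# This is the line that fails on the Space with the ValueError above.
tokenizer = AutoTokenizer.from_pretrained('gpt2')

def tokenize(text):
    # Return the token ids as a plain string so Gradio can display them.
    return str(tokenizer(text)["input_ids"])

demo = gr.Interface(fn=tokenize, inputs="text", outputs="text")
demo.launch()
```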
Any ideas on what we can do to fix this?
Thanks in advance,
Theodore.