Good afternoon. I’m trying to run a simple piece of code in local JupyterLab on the Russian Yandex DataSphere service. It loads a model using a pipeline, but I get a strange error. Could this be a block on the Hugging Face side?
This is my code:
from transformers import pipeline
prompt = "Write an email about an alpaca that likes flan"
model = pipeline(model="declare-lab/flan-alpaca-xl")
model(prompt, max_length=128, do_sample=True)
This is my error:
404 Client Error: Not Found for url: huggingface.co/declare-lab/flan-alpaca-xl/resolve/main/pytorch_model.bin
404 Client Error: Not Found for url: huggingface.co/declare-lab/flan-alpaca-xl/resolve/main/tf_model.h5
404 Client Error: Not Found for url: huggingface.co/declare-lab/flan-alpaca-xl/resolve/main/pytorch_model.bin
404 Client Error: Not Found for url: huggingface.co/declare-lab/flan-alpaca-xl/resolve/main/tf_model.h5
ValueError                                Traceback (most recent call last)
in
      1 from transformers import pipeline
      2
----> 3 model = pipeline(model="declare-lab/flan-alpaca-xl")
      4

/usr/local/lib/python3.8/dist-packages/transformers/pipelines/__init__.py in pipeline(task, model, config, tokenizer, feature_extractor, framework, revision, use_fast, use_auth_token, model_kwargs, pipeline_class, **kwargs)
    541     # Will load the correct model if possible
    542     model_classes = {"tf": targeted_task["tf"], "pt": targeted_task["pt"]}
--> 543     framework, model = infer_framework_load_model(
    544         model,
    545         model_classes=model_classes,

/usr/local/lib/python3.8/dist-packages/transformers/pipelines/base.py in infer_framework_load_model(model, config, model_classes, task, framework, **model_kwargs)
    229
    230     if isinstance(model, str):
--> 231         raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
    232
    233     framework = "tf" if model.__class__.__name__.startswith("TF") else "pt"

ValueError: Could not load model declare-lab/flan-alpaca-xl with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForSeq2SeqLM'>, <class 'transformers.models.auto.modeling_tf_auto.TFAutoModelForSeq2SeqLM'>, <class 'transformers.models.t5.modeling_t5.T5ForConditionalGeneration'>, <class 'transformers.models.t5.modeling_tf_t5.TFT5ForConditionalGeneration'>).
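To tell a network block apart from a genuinely missing file, I also tried checking the URLs from the error directly. This is a minimal diagnostic sketch, assuming the `requests` library is available (transformers already depends on it); the `check_url` helper and the choice of `config.json` as a probe file are mine, not from the original code:

```python
# Sketch: probe a Hugging Face URL and report what happens, to distinguish
# "host unreachable / blocked" (connection error) from "file not in the repo"
# (an HTTP status like 404). Assumes the `requests` package is installed.
import requests

def check_url(url, timeout=10):
    """Return 'HTTP <status>' if the host answered, or the error message."""
    try:
        resp = requests.head(url, timeout=timeout, allow_redirects=True)
        return f"HTTP {resp.status_code}"
    except requests.RequestException as exc:
        return f"request failed: {exc}"

# config.json exists in virtually every model repo, so it is a good probe:
print(check_url("https://huggingface.co/declare-lab/flan-alpaca-xl/resolve/main/config.json"))
```

If this prints a `request failed: ...` message, the host is unreachable from the environment (a block or proxy issue); if it prints an HTTP status, the connection itself works and the 404 means that particular file is not in the repository.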