The question-answering example in the doc throws an AttributeError exception. Please help

Hi.

Admittedly, I am a beginner with Hugging Face, though I do have some Python and general programming experience.

I am using

  • transformers version: 3.5.1
  • Platform: Windows-10-10.0.18362-SP0
  • Python version: 3.6.12
  • PyTorch version (GPU?): 1.7.0 (False)
  • Tensorflow version (GPU?): not installed (NA)
  • Using GPU in script?: No
  • Using distributed or parallel set-up in script?: No

I copy-pasted the following code (see the bottom of this post) from the "Summary of the tasks" page of the transformers 4.0.0 documentation on huggingface.co.

However, this code (I made no changes) fails with the following error:

Traceback (most recent call last):
  File "c:/Workspace/py-conda-workspaces/py36-conda-speechDemo/text-question-answering.py", line 21, in <module>
    answer_start_scores = outputs.start_logits
AttributeError: 'tuple' object has no attribute 'start_logits'

Here is the code:

from transformers import AutoTokenizer, AutoModelForQuestionAnswering
import torch
tokenizer = AutoTokenizer.from_pretrained("bert-large-uncased-whole-word-masking-finetuned-squad")
model = AutoModelForQuestionAnswering.from_pretrained("bert-large-uncased-whole-word-masking-finetuned-squad")
text = r"""
🤗 Transformers (formerly known as pytorch-transformers and pytorch-pretrained-bert) provides general-purpose
architectures (BERT, GPT-2, RoBERTa, XLM, DistilBert, XLNet…) for Natural Language Understanding (NLU) and Natural
Language Generation (NLG) with over 32+ pretrained models in 100+ languages and deep interoperability between
TensorFlow 2.0 and PyTorch.
"""
questions = [
    "How many pretrained models are available in 🤗 Transformers?",
    "What does 🤗 Transformers provide?",
    "🤗 Transformers provides interoperability between which frameworks?",
]
for question in questions:
    inputs = tokenizer(question, text, add_special_tokens=True, return_tensors="pt")
    input_ids = inputs["input_ids"].tolist()[0]
    text_tokens = tokenizer.convert_ids_to_tokens(input_ids)
    outputs = model(**inputs)
    answer_start_scores = outputs.start_logits
    answer_end_scores = outputs.end_logits
    answer_start = torch.argmax(
        answer_start_scores
    )  # Get the most likely beginning of answer with the argmax of the score
    answer_end = torch.argmax(answer_end_scores) + 1  # Get the most likely end of answer with the argmax of the score
    answer = tokenizer.convert_tokens_to_string(tokenizer.convert_ids_to_tokens(input_ids[answer_start:answer_end]))
    print(f"Question: {question}")
    print(f"Answer: {answer}")

Please help. What did I do wrong?

You're using code written for transformers v4 with an earlier version, so it doesn't work 🙂
You can either:

  • upgrade your installation
  • replace the line defining your model with this:
model = AutoModelForQuestionAnswering.from_pretrained("bert-large-uncased-whole-word-masking-finetuned-squad", return_dict=True)
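
Alternatively, if you stay on transformers 3.x without return_dict=True, the model output there is a plain tuple rather than an object with named attributes, so you can unpack it instead. A minimal sketch of that variant (only the lines reading the scores change):

outputs = model(**inputs)
# In transformers 3.x the QA model returns a tuple: (start_logits, end_logits)
answer_start_scores, answer_end_scores = outputs[0], outputs[1]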

Thank you. I upgraded using pip install --upgrade transformers and it all worked 😃

Hi

Though my original issue was resolved, I now have a new one: transformers-cli is crashing.

I type

transformers-cli --help

and the result is

Traceback (most recent call last):
  File "c:\users\gilad\miniconda3\envs\speechdemoenv_nmt\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "c:\users\gilad\miniconda3\envs\speechdemoenv_nmt\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\Users\Gilad\miniconda3\envs\speechDemoEnv_NMT\Scripts\transformers-cli.exe\__main__.py", line 4, in <module>
  File "c:\users\gilad\miniconda3\envs\speechdemoenv_nmt\lib\site-packages\transformers\commands\transformers_cli.py", line 4, in <module>
    from transformers.commands.add_new_model import AddNewModelCommand
  File "c:\users\gilad\miniconda3\envs\speechdemoenv_nmt\lib\site-packages\transformers\commands\add_new_model.py", line 8, in <module>   
    from cookiecutter.main import cookiecutter
ModuleNotFoundError: No module named 'cookiecutter'

Help!

You need to run pip install cookiecutter. We'll fix that missing dependency on Monday!
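
For reference, the full sequence from the activated conda environment would be:

pip install cookiecutter
transformers-cli --help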

Thanks, that worked!

transformers.__version__
'4.39.3'

I am still facing this issue when trying to get a PEFT model:

AttributeError                            Traceback (most recent call last)
Cell In[14], line 1
----> 1 model = get_peft_model(model, peft_config)

File /opt/conda/lib/python3.11/site-packages/peft/mapping.py:126, in get_peft_model(model, peft_config, adapter_name, mixed)
    123 if hasattr(model_config, "to_dict"):
    124     model_config = model_config.to_dict()
--> 126 peft_config.base_model_name_or_path = model.__dict__.get("name_or_path", None)
    128 if mixed:
    129     return PeftMixedModel(model, peft_config, adapter_name=adapter_name)

AttributeError: 'tuple' object has no attribute '__dict__'
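
The traceback indicates that model is a tuple, not a model instance, by the time it reaches get_peft_model, which is why model.__dict__ fails. A hypothetical minimal reproduction (the model name and the mistake shown are illustrative assumptions, not taken from the post):

from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Hypothetical repro: the stray trailing comma turns `model` into a
# one-element tuple, and get_peft_model then fails on model.__dict__
# with the same AttributeError.
model = AutoModelForCausalLM.from_pretrained("gpt2"),  # <- stray comma

# Another way to end up with a tuple: from_pretrained(..., output_loading_info=True)
# returns (model, loading_info), which must be unpacked before use.

model = get_peft_model(model, LoraConfig())  # AttributeError: 'tuple' object has no attribute '__dict__'

Checking type(model) right before the get_peft_model call should confirm which assignment produced the tuple.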