Oh! I had transformers 3.3.1 installed. I upgraded to 4.2.1, but now it throws a different error:
```
404 Client Error: Not Found for url: https://huggingface.co/nielsr/tapas-base-finetuned-wtq/resolve/main/config.json
```
I tried a different model with the following script:
```python
from transformers import AutoTokenizer, AutoModelForTableQuestionAnswering

tokenizer = AutoTokenizer.from_pretrained("nielsr/tapas-base-finetuned-sqa")
model = AutoModelForTableQuestionAnswering.from_pretrained("nielsr/tapas-base-finetuned-sqa")
```
but I get the following error:
```
ImportError:
TapasModel requires the torch-scatter library but it was not found in your
environment. You can install it with pip as
explained here: https://github.com/rusty1s/pytorch_scatter.
```
I tried pip-installing it as described there, but I get a different error:
```
error: Microsoft Visual C++ 14.0 or greater is required. Get it with "Microsoft C++
Build Tools": https://visualstudio.microsoft.com/visual-cpp-build-tools/
```
I already have Visual C++ 14.x on my machine, so I guess the problem is now specific to my setup.
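As a possible workaround (a sketch, assuming a prebuilt wheel exists for your exact torch/CUDA combination), torch-scatter can often be installed from the project's wheel index instead of being compiled from source, which sidesteps the MSVC build requirement entirely. The torch version and CUDA tag below are placeholders; check the rusty1s/pytorch_scatter README for the combinations it actually publishes:

```shell
# Install a prebuilt torch-scatter wheel rather than building from source.
# Replace "1.7.0" and "cpu" with your installed torch version and CUDA tag
# (e.g. cu110), per the pytorch_scatter README.
pip install torch-scatter -f https://pytorch-geometric.com/whl/torch-1.7.0+cpu.html
```

If no matching wheel is published for your configuration, building from source (and hence the C++ build tools) is still required.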
I ran this myself and it indeed throws an error, indicating that ‘nielsr/tapas-base-finetuned-wtq’ does not exist. That is expected, since I deleted that model from the hub: everything now lives under the Google namespace. @lysandre we should update the default model of the TableQuestionAnsweringPipeline to ‘google/tapas-base-finetuned-wtq’ instead of ‘nielsr/tapas-base-finetuned-wtq’.
For now, it works if you pass a model name that does exist on the model hub (note that you still need to install the torch-scatter dependency for your environment):
```python
from transformers import pipeline
import pandas as pd

tqa = pipeline("table-question-answering", model="google/tapas-base-finetuned-wtq")
```
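To round out the snippet (a sketch; the table contents and query below are made up for illustration), note that the TAPAS tokenizer expects every cell of the table to be a string, so cast the DataFrame before querying the pipeline:

```python
import pandas as pd

# TAPAS expects string-valued cells, so cast the whole DataFrame to str.
table = pd.DataFrame(
    {"City": ["Paris", "London", "Lyon"], "Population": [2161000, 8982000, 513275]}
).astype(str)

# With the pipeline loaded as above (requires torch and torch-scatter):
# result = tqa(table=table, query="Which city has the largest population?")
# result["answer"] then holds the predicted cell(s).
print(table)
```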