ValueError "Too many rows" with Tapas/TableQuestionAnswering pipeline - How to fix it?

Hi guys! :wave:

I wanted to query a dataframe via the "table-question-answering" pipeline. It works well with small dataframes; however, as soon as I import larger dataframes (e.g. with ~400 rows), I get the following error:

ValueError: Too many rows
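
For reference, here's roughly what I'm running (the checkpoint name is just an example; my real table comes from a CSV import):

```python
import pandas as pd
from transformers import pipeline

# ~400-row dataframe, similar in shape to my real data (cells cast to strings, as TAPAS expects)
df = pd.DataFrame({
    "city": [f"city_{i}" for i in range(400)],
    "population": [str(10_000 + i) for i in range(400)],
})

tqa = pipeline("table-question-answering", model="google/tapas-base-finetuned-wtq")
result = tqa(table=df, query="Which city has the largest population?")  # raises ValueError: Too many rows
```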

Any idea what may be happening here?

Thanks in advance :pray:

Charly

pinging @lysandre

He’s on vacation so you might have to wait two weeks :wink: Looking at the code, you passed more rows than allowed by tokenizer.max_row_id, so you should send a shorter table.
There also seems to be an option drop_rows_to_fit=True that you can pass to avoid this error.
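
I haven't tested this, but something along these lines should work (a sketch reusing the df from your snippet; the checkpoint name is just an example, and I'm assuming drop_rows_to_fit is accepted as a tokenizer init argument as the docs suggest):

```python
from transformers import TapasTokenizer, pipeline

model_name = "google/tapas-base-finetuned-wtq"  # example checkpoint; use whichever TAPAS model you query with

# Option 1: shrink the table yourself so it stays under tokenizer.max_row_id
tqa = pipeline("table-question-answering", model=model_name)
result = tqa(table=df.head(50).astype(str), query="Which city has the largest population?")

# Option 2: load the tokenizer with drop_rows_to_fit=True so rows beyond the limit are
# dropped instead of raising "Too many rows", then pass that tokenizer to the pipeline
tokenizer = TapasTokenizer.from_pretrained(model_name, drop_rows_to_fit=True)
tqa = pipeline("table-question-answering", model=model_name, tokenizer=tokenizer)
result = tqa(table=df.astype(str), query="Which city has the largest population?")
```

Note that with drop_rows_to_fit the extra rows are silently discarded, so answers can only come from the rows that are kept.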

Thank you Sylvain! I’ll give it a whirl! :pray: