Using TAPAS model for large datasets

It seems that TAPAS models can only handle around 100 rows of a given CSV file, since the model's 512-token input limit is hit very quickly :frowning:

This is not very helpful in an Enterprise environment.
How do I train the google/tapas-large-finetuned-wtq model using a CSV file which has around 80,000 rows?
Any example code would be helpful.
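One workaround I have seen suggested (not an official TAPAS feature, just a sketch) is to split the large table into row chunks that each fit under the 512-token limit and query each chunk separately. Note the caveat: per-chunk answers only combine cleanly for lookup-style questions, not for aggregations over the whole table. The `chunk_table` helper below is my own illustration; the `pipeline` usage is the real `transformers` table-question-answering API, shown in comments because it downloads ~1.3 GB of weights.

```python
import pandas as pd

def chunk_table(df, rows_per_chunk=50):
    """Yield row chunks small enough to stay under TAPAS's 512-token input limit."""
    for start in range(0, len(df), rows_per_chunk):
        yield df.iloc[start:start + rows_per_chunk]

# Hypothetical usage with the transformers pipeline (query string is a placeholder):
# from transformers import pipeline
# tqa = pipeline("table-question-answering",
#                model="google/tapas-large-finetuned-wtq")
# answers = []
# for chunk in chunk_table(big_df):
#     table = chunk.astype(str)  # TAPAS expects all cells as strings
#     answers.append(tqa(table=table, query="Which city has the highest population?"))

# Demo on a synthetic 120-row table: split into chunks of at most 50 rows.
big_df = pd.DataFrame({"city": [f"c{i}" for i in range(120)],
                       "population": range(120)})
chunks = list(chunk_table(big_df))
print(len(chunks))
```

With 120 rows and 50 rows per chunk this yields 3 chunks (50, 50, and 20 rows). You would still need your own logic to merge the per-chunk answers, which is why this is only a partial fix.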


Hi, did you find a solution to your problem with large datasets and the TAPAS model "tapas-large-finetuned-wtq"? If so, please share your approach with us; we are facing the same problem. Thank you.

Unfortunately not :frowning:
We moved on to trying out the PandasAI library instead, and use it to generate visuals from Excel data.
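For anyone curious, a minimal sketch of that PandasAI approach, assuming the pandasai 2.x `SmartDataframe` API and an OpenAI key (the chat call is shown in comments since it needs a live LLM; the file name `data.xlsx` is a placeholder):

```python
import pandas as pd

# Stand-in for df = pd.read_excel("data.xlsx") so the snippet runs without a file.
df = pd.DataFrame({"month": ["Jan", "Feb", "Mar"],
                   "sales": [100, 150, 120]})

# Hypothetical PandasAI usage (requires an LLM API key, so not executed here):
# from pandasai import SmartDataframe
# from pandasai.llm import OpenAI
# llm = OpenAI(api_token="YOUR_KEY")              # placeholder key
# sdf = SmartDataframe(df, config={"llm": llm})
# sdf.chat("Plot monthly sales as a bar chart")   # generates a chart from the data

print(df.shape)
```

Because PandasAI translates the question into pandas code rather than feeding every row through a transformer, it does not hit a row limit the way TAPAS does.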

Hope this helps you somewhat.