First time with AI apps. Do I need a GPU in order to run a model using transformers?

I made this app using transformers and Hugging Face:

from pathlib import Path
from typing import Union

from pypdf import PdfReader
from transformers import pipeline

# Load an extractive question-answering pipeline with a small RoBERTa model.
question_answerer = pipeline(task="question-answering", model="deepset/tinyroberta-squad2")
question = "I have car insurance; is it insured in case of fire, and how much money would I receive?"

def get_text_from_pdf(pdf_file: Union[str, Path]) -> str:
    """Read the PDF from the given path and return a string with its entire content."""
    reader = PdfReader(pdf_file)

    # Extract text from all pages
    full_text = ""
    for page in reader.pages:
        full_text += page.extract_text()
    return full_text


pdf_text = get_text_from_pdf("./pdf.pdf")
# Ask the model to find the answer span inside the PDF text.
answer = question_answerer(question=question, context=pdf_text)
print(answer)

But getting the response is slow, which left me with these questions:

  • Is the model running locally or remotely?
  • Do I need a GPU to run this model in a decent time?

What are the recommended GPU requirements for the script above to run?

Assuming you are running the script locally, not in Colab or similar:

Is the model running locally or remotely?

Locally.
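
You can confirm this yourself: the model files are downloaded once and cached locally (by default under ~/.cache/huggingface/hub), and the pipeline object reports which device it ended up on. A minimal sketch, reusing the model name from your script (the variable name qa is just for illustration):

from transformers import pipeline

qa = pipeline(task="question-answering", model="deepset/tinyroberta-squad2")
print(qa.device)  # e.g. "cpu" -> the model is loaded and executed on your machine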

Do I need a GPU to run this model in a decent time?

It would definitely be faster with a GPU; a sketch of how to use one follows.
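
Here is one way you could opt into a GPU when one is available; the device argument to pipeline takes -1 for CPU or a CUDA device index. This is a sketch assuming PyTorch as the backend (which transformers uses here anyway), and it leaves the rest of your script unchanged:

import torch
from transformers import pipeline

# Use the first CUDA GPU if present, otherwise fall back to the CPU.
device = 0 if torch.cuda.is_available() else -1
question_answerer = pipeline(
    task="question-answering",
    model="deepset/tinyroberta-squad2",
    device=device,
)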

What are the recommended GPU requirements for the script above to run?

This model is only about 300 MB, so a GeForce with 1 GB of VRAM would be fine; even the cheapest card will work.
Generally speaking, though, 12 GB of VRAM is the practical minimum if you are going to do a lot with AI. That is about what I have, and mine is probably on the weak end among HF users.
In any case, buy a recent NVIDIA card; the other system requirements matter far less.
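
If you are not sure what your machine has, here is a quick way to check for a CUDA GPU and its VRAM from Python, again assuming PyTorch is installed:

import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"{props.name}: {props.total_memory / 1024**3:.1f} GB VRAM")
else:
    print("No CUDA GPU detected; the pipeline will run on the CPU.")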
