Run models on a desktop computer?

Hello @antcodes,

Yes, you can run any of the models from the Hub locally.
A good place to start is here: Installation
Set up a local Python environment and install the required packages (for example, pip install transformers torch if you use PyTorch as the backend).
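
Since you asked about a desktop computer: the example below runs fine on CPU, and is simply faster if a CUDA-capable GPU is available. A quick sanity check of what your machine offers (assuming you installed PyTorch as the backend) is:

import torch

# Rough check of the local hardware (assumes the PyTorch backend is installed).
print(torch.cuda.is_available())   # True if a CUDA GPU can be used
print(torch.get_num_threads())     # number of CPU threads PyTorch will use otherwise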

For example, if you run the code below (from the bert-base-NER model card),
it will download the model to your local cache the first time.

You can read more about pipelines here.

from transformers import AutoTokenizer, AutoModelForTokenClassification
from transformers import pipeline

# The first run downloads the tokenizer and model to your local cache; later runs load them from disk.
tokenizer = AutoTokenizer.from_pretrained("dslim/bert-base-NER")
model = AutoModelForTokenClassification.from_pretrained("dslim/bert-base-NER")

# Build a NER pipeline that runs entirely on your machine.
nlp = pipeline("ner", model=model, tokenizer=tokenizer)
example = "My name is Wolfgang and I live in Berlin"

ner_results = nlp(example)
print(ner_results)
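
A side note: the downloaded files are reused from the local cache (by default under ~/.cache/huggingface on Linux/macOS), so later runs don't need to download anything again. Once cached, you can also let the pipeline load everything from the model id in one call; a minimal sketch of the same NER example:

from transformers import pipeline

# The pipeline creates the tokenizer and model for you, reusing the cached files from above.
ner = pipeline("ner", model="dslim/bert-base-NER")
print(ner("My name is Wolfgang and I live in Berlin"))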