NER not able to identify fruits

from transformers import AutoTokenizer, AutoModelForTokenClassification

from transformers import pipeline

tokenizer = AutoTokenizer.from_pretrained("dslim/bert-base-NER")

model = AutoModelForTokenClassification.from_pretrained("dslim/bert-base-NER")

nlp = pipeline("ner", model=model, tokenizer=tokenizer)

example = ["An apple a day keeps a doctor away.", "Apple is really good", "Apple is a healthy fruit", "Mango is a healthy fruit"]

ner_results = nlp(example)

print(ner_results)

Some weights of the model checkpoint at dslim/bert-base-NER were not used when initializing BertForTokenClassification: ['bert.pooler.dense.weight', 'bert.pooler.dense.bias']
- This IS expected if you are initializing BertForTokenClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing BertForTokenClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).

[[], [{'entity': 'B-ORG', 'score': 0.992562, 'index': 1, 'word': 'Apple', 'start': 0, 'end': 5}], [], []]
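For context, a quick way to check which entity types this checkpoint can actually predict is to print its label map. This is just a minimal sketch using the same dslim/bert-base-NER checkpoint as above; the exact index order may differ:

from transformers import AutoModelForTokenClassification

model = AutoModelForTokenClassification.from_pretrained("dslim/bert-base-NER")

# The label map comes from the model config and reflects the CoNLL-2003 tag set
print(model.config.id2label)
# Roughly: {0: 'O', 1: 'B-MISC', 2: 'I-MISC', 3: 'B-PER', 4: 'I-PER', 5: 'B-ORG', 6: 'I-ORG', 7: 'B-LOC', 8: 'I-LOC'}

Since the tag set only covers persons, organisations, locations and miscellaneous proper nouns, there is no fruit or food category at all, so fruit names will only ever be tagged when the model mistakes them for one of those four types (e.g. the capitalised "Apple" looks like the company name, hence B-ORG).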