"No token was detected" when using Hosted inference API

Hello, when I try to use models from the Hub to run inference on my own test cases, I get the error: "No token was detected".

For example, with the token classification model xlm-roberta-large-finetuned-conll03-english · Hugging Face:

Running the default test case directly shows no error, but when I replace it with my own local test case, it produces the error.

So I wonder whether there is some system limitation, or another reason for this? I have actually tested many models in the token classification category, and they all produce the same error.
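
For reference, as far as I can tell the hosted widget is a front end for the Inference API, so calling the endpoint directly is one way to check whether the problem is the input text or authentication. A minimal sketch (the `hf_xxx` token value is a placeholder for a personal access token from your account settings, and the `requests` package is assumed to be installed):

```python
import requests

# Placeholder: replace with your own access token from hf.co/settings/tokens
API_TOKEN = "hf_xxx"
API_URL = "https://api-inference.huggingface.co/models/xlm-roberta-large-finetuned-conll03-english"
headers = {"Authorization": f"Bearer {API_TOKEN}"}

def query(payload):
    # POST the input text to the public Inference API endpoint
    response = requests.post(API_URL, headers=headers, json=payload)
    response.raise_for_status()
    return response.json()

# Example token-classification input
print(query({"inputs": "My name is Clara and I live in Berkeley, California."}))
```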


Facing the same issue

I’m facing the same issue. Does anyone have a solution?

It seems we all have the same issue.
Has anyone found a solution for it?
Who could we contact about this?