Azure ML online endpoint: Hugging Face zero-shot model

I deployed the facebook/bart-large-mnli model from Hugging Face to Azure ML using an online endpoint, but whenever I call the model I get the same error: "Internal Server Error". I tried the "Test" tab in the UI and used the exact example payload it suggests:

```json
{
  "inputs": "I have a problem with my iphone that needs to be resolved asap",
  "candidate_labels": "phone, tablet, computer"
}
```

but the error still occurs.
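For context, this is roughly how the endpoint gets invoked from Python (a minimal sketch: the scoring URI and key below are placeholders, not my real values, and the payload is the same one from the UI test):

```python
import json
import urllib.request

# Placeholder values -- substitute the endpoint's real scoring URI and key.
SCORING_URI = "https://<endpoint-name>.<region>.inference.ml.azure.com/score"
API_KEY = "<endpoint-key>"

# Same payload the Azure ML "Test" tab suggests for this model.
payload = {
    "inputs": "I have a problem with my iphone that needs to be resolved asap",
    "candidate_labels": "phone, tablet, computer",
}

body = json.dumps(payload).encode("utf-8")
request = urllib.request.Request(
    SCORING_URI,
    data=body,
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
)
# urllib.request.urlopen(request)  # this is the call that comes back with a 500
```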

Instance count: 2

This is the error shown in the "deployment status":

```
INFO: - "POST /score HTTP/1.1" 500 Internal Server Error
ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "/opt/miniconda/lib/python3.8/site-packages/uvicorn/protocols/http/", line 435, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/opt/miniconda/lib/python3.8/site-packages/uvicorn/middleware/", line 78, in __call__
    return await self.app(scope, receive, send)
  File "/opt/miniconda/lib/python3.8/site-packages/fastapi/", line 276, in __call__
    await super().__call__(scope, receive, send)
  File "/opt/miniconda/lib/python3.8/site-packages/starlette/", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/opt/miniconda/lib/python3.8/site-packages/starlette/middleware/", line 184, in __call__
    raise exc
  File "/opt/miniconda/lib/python3.8/site-packages/starlette/middleware/", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/opt/miniconda/lib/python3.8/site-packages/starlette/middleware/", line 83, in __call__
    await self.app(scope, receive, send)
  File "/opt/miniconda/lib/python3.8/site-packages/starlette/middleware/", line 79, in __call__
    raise exc
  File "/opt/miniconda/lib/python3.8/site-packages/starlette/middleware/", line 68, in __call__
    await self.app(scope, receive, sender)
  File "/opt/miniconda/lib/python3.8/site-packages/fastapi/middleware/", line 21, in __call__
    raise e
  File "/opt/miniconda/lib/python3.8/site-packages/fastapi/middleware/", line 18, in __call__
    await self.app(scope, receive, send)
  File "/opt/miniconda/lib/python3.8/site-packages/starlette/", line 718, in __call__
    await route.handle(scope, receive, send)
  File "/opt/miniconda/lib/python3.8/site-packages/starlette/", line 276, in handle
    await self.app(scope, receive, send)
  File "/opt/miniconda/lib/python3.8/site-packages/starlette/", line 66, in app
    response = await func(request)
  File "/opt/miniconda/lib/python3.8/site-packages/fastapi/", line 237, in app
    raw_response = await run_endpoint_function(
  File "/opt/miniconda/lib/python3.8/site-packages/fastapi/", line 163, in run_endpoint_function
    return await dependant.call(**values)
  File "/code/", line 63, in create_item
    pred = inference_handler.handle(payload)
  File "/code/", line 180, in handle
    prediction = self.model(data.inputs)
  File "/opt/miniconda/lib/python3.8/site-packages/transformers/pipelines/", line 205, in __call__
    return super().__call__(sequences, **kwargs)
  File "/opt/miniconda/lib/python3.8/site-packages/transformers/pipelines/", line 1101, in __call__
    return next(
  File "/opt/miniconda/lib/python3.8/site-packages/transformers/pipelines/", line 124, in __next__
    item = next(self.iterator)
  File "/opt/miniconda/lib/python3.8/site-packages/transformers/pipelines/", line 266, in __next__
    processed = self.infer(next(self.iterator), **self.params)
  File "/opt/miniconda/lib/python3.8/site-packages/torch/utils/data/", line 633, in __next__
    data = self._next_data()
  File "/opt/miniconda/lib/python3.8/site-packages/torch/utils/data/", line 677, in _next_data
    data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
  File "/opt/miniconda/lib/python3.8/site-packages/torch/utils/data/_utils/", line 32, in fetch
  File "/opt/miniconda/lib/python3.8/site-packages/transformers/pipelines/", line 183, in __next__
    processed = next(self.subiterator)
  File "/opt/miniconda/lib/python3.8/site-packages/transformers/pipelines/", line 208, in preprocess
    sequence_pairs, sequences = self._args_parser(inputs, candidate_labels, hypothesis_template)
  File "/opt/miniconda/lib/python3.8/site-packages/transformers/pipelines/", line 25, in __call__
    if len(labels) == 0 or len(sequences) == 0:
TypeError: object of type 'NoneType' has no len()
```
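Reading the last frames: the pipeline's argument parser receives `candidate_labels` as `None` and then calls `len()` on it. A minimal sketch reproducing that failure mode (my own simplified mock of the check in the final traceback frame, not the actual transformers code):

```python
# Simplified mock of the argument check visible in the last traceback frame;
# the real transformers ZeroShotClassificationArgumentHandler does more.
def parse_zero_shot_args(sequences, labels):
    # This is the line that blows up when labels arrives as None.
    if len(labels) == 0 or len(sequences) == 0:
        raise ValueError("You must include at least one label and one sequence.")
    if isinstance(labels, str):
        labels = [label.strip() for label in labels.split(",")]
    return sequences, labels

# A well-formed payload parses fine:
parse_zero_shot_args("I have a problem with my iphone", "phone, tablet, computer")

# But if candidate_labels never reaches the handler, len(None) raises
# exactly the error from the logs:
try:
    parse_zero_shot_args("I have a problem with my iphone", None)
except TypeError as exc:
    print(exc)  # object of type 'NoneType' has no len()
```

So it looks like the `candidate_labels` field of the request body is not making it through to the inference handler, even though it is present in the payload I send.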