Will this Inference Endpoint work?

I want to deploy the cardiffnlp/twitter-roberta-base-offensive model using Inference Endpoints, but when I select the deploy option I get this message:

‘Warning: deploying this model will probably fail because no “tokenizer.json” file was found in the repository. Try selecting a different model or creating a custom handler.’

However, the repository does contain vocab.json, special_tokens_map.json, and merges.txt files, which might be enough for the tokenizer to work. Would it be safe to deploy this model as-is using Inference Endpoints, or is there still an issue? If there is, how can I resolve it? I'd appreciate any help, as I'd really like to deploy this model with Inference Endpoints if possible.

Hi @Jordan93, you can use this model with Inference Endpoints, but it will need a custom handler. More information on creating and using one can be found here: Create custom Inference Handler.
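
As a rough illustration, a handler.py for this repository could look like the minimal sketch below. It assumes the standard EndpointHandler interface (a class with `__init__(path)` and `__call__(data)`) and relies on `AutoTokenizer`, which can build the tokenizer from vocab.json and merges.txt even without tokenizer.json. The payload key `"inputs"` and the exact post-processing shown here are assumptions you may want to adapt.

```python
# handler.py — minimal sketch of a custom Inference Endpoints handler.
from typing import Any, Dict, List

import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer


class EndpointHandler:
    def __init__(self, path: str = ""):
        # AutoTokenizer can construct the tokenizer from vocab.json and
        # merges.txt, so the missing tokenizer.json is not a blocker here.
        self.tokenizer = AutoTokenizer.from_pretrained(path)
        self.model = AutoModelForSequenceClassification.from_pretrained(path)
        self.model.eval()

    def __call__(self, data: Dict[str, Any]) -> List[Dict[str, Any]]:
        # Assumes the request payload arrives under the "inputs" key.
        text = data.get("inputs", "")
        encoded = self.tokenizer(text, return_tensors="pt", truncation=True)
        with torch.no_grad():
            logits = self.model(**encoded).logits
        scores = torch.softmax(logits, dim=-1)[0]
        labels = self.model.config.id2label
        return [
            {"label": labels[i], "score": float(scores[i])}
            for i in range(len(scores))
        ]
```

Once a handler.py like this is committed to the model repository (or a fork of it), Inference Endpoints should pick it up instead of trying to load the default pipeline.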