Error while loading model on Hosted Inference API

I fine-tuned the “distilroberta-base” model for my text classification project.
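Roughly what I did, in case it matters (a simplified sketch; the repo name and label count below are placeholders, not my exact values):

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Load the base model and tokenizer for sequence classification
tokenizer = AutoTokenizer.from_pretrained("distilroberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilroberta-base",
    num_labels=2,  # placeholder: my actual label count
)

training_args = TrainingArguments(
    output_dir="distilroberta-finetuned",  # placeholder repo name
    push_to_hub=True,
)

trainer = Trainer(
    model=model,
    args=training_args,
    tokenizer=tokenizer,
    # train_dataset=...  (my tokenized dataset goes here)
)

# trainer.train()
trainer.push_to_hub()  # uploads the weights, tokenizer files, and config.json
```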

After pushing my model to the hub, I am getting the following error on the Hosted Inference API:
Can't load config for 'None'. Make sure that:
- 'None' is a correct model identifier listed on 'https://huggingface.co/models'
- or 'None' is the correct path to a directory containing a config.json file
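If it helps narrow things down: since the error mentions resolving a config.json, I assume the widget does something equivalent to loading the config by repo id. This is the kind of local check I can run (the repo id below is a placeholder for mine):

```python
from transformers import AutoConfig

# Placeholder repo id -- substitute the actual "<username>/<model>" path
config = AutoConfig.from_pretrained("my-username/distilroberta-finetuned")
print(config)
```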

Would anyone be able to help me out with this?

This is the dataset I used, and here is the link to my model, in case anyone wants to check my files.

Thanks for helping me out, guys!
