I’m unable to run the model I trained with AutoNLP. I did everything through the UI, but when I send a request to the Inference API, I get this error:
Could not load model [model id here] with any of the following classes: (<class 'transformers.models.bert.modeling_bert.BertForSequenceClassification'>, <class 'transformers.models.bert.modeling_tf_bert.TFBertForSequenceClassification'>).
I get the same error when I try running inference through the widget on the model page. Any ideas?