Why is the Inference API not working for the model I uploaded?

[Sorry for my bad English, English is not my native language.]

This is the first model I have uploaded to Hugging Face (BaranKanat/BerTurk-SpamSMS · Hugging Face). It simply checks whether an incoming SMS is spam or not.

I wrote a simple Python script as follows.

import requests

API_URL = "https://api-inference.huggingface.co/models/BaranKanat/BerTurk-SpamSMS"
headers = {"Authorization": "Bearer hf_XXXXXXXX"}

payload = {"inputs": "2000 TL DENEME BONUSU KAZANDINIZ !!! YATIRIM SARTI YOK KAZANC ve CEKIM LIMITI YOK."}
response = requests.post(API_URL, headers=headers, json=payload)
print(response.json())

But I am getting the following error.

{'error': 'Task not found for this model'}

No matter what I did, I couldn't fix this error. The site asks me to specify a task for my model; I did that too, but it still didn't work.
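For reference, here is a slightly more defensive version of the snippet above. It is only a sketch (the helper names are mine, not from the thread): it separates request construction from the network call and surfaces HTTP errors, such as the 503 the serverless backend returns while a model is still loading, instead of printing the raw JSON.

```python
import requests

API_URL = "https://api-inference.huggingface.co/models/BaranKanat/BerTurk-SpamSMS"

def build_request(text, token):
    # Assemble the pieces of the Inference API call.
    # Kept as a pure function so it is easy to inspect and test.
    headers = {"Authorization": f"Bearer {token}"}
    payload = {"inputs": text}
    return API_URL, headers, payload

def query(text, token):
    url, headers, payload = build_request(text, token)
    response = requests.post(url, headers=headers, json=payload)
    if response.status_code == 503:
        # The model is still being loaded on the serverless backend.
        raise RuntimeError("Model is loading, retry after a short delay")
    response.raise_for_status()  # surfaces 4xx errors like 'Task not found'
    return response.json()

# Example (requires a valid token and network access):
# print(query("2000 TL DENEME BONUSU KAZANDINIZ !!!", "hf_XXXXXXXX"))
```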


It seems that you need to specify the library name in the model card metadata. In many cases it is recognized automatically…
But leaving that aside: to put it simply, it is currently almost impossible to use models uploaded by general users through the serverless Inference API. This change happened within the last six months or so, so in many cases the documentation has not caught up yet…

For example, the metadata block at the top of the model's README.md would look like this:

---
license: apache-2.0
pipeline_tag: text-classification
library_name: transformers
tags:
- text-classification
- spam-detection
---
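Once the `pipeline_tag` is in place, a text-classification request should return the usual shape for that task: a nested list of label/score dicts. A small helper to pull out the top prediction (the helper name and the `SPAM`/`HAM` labels below are illustrative; the model's actual labels may differ):

```python
def top_prediction(result):
    # The text-classification task typically returns
    # [[{"label": ..., "score": ...}, ...]]; unwrap the outer list if present.
    candidates = result[0] if result and isinstance(result[0], list) else result
    return max(candidates, key=lambda c: c["score"])

# Example with a made-up response payload:
sample = [[{"label": "SPAM", "score": 0.97}, {"label": "HAM", "score": 0.03}]]
best = top_prediction(sample)
# best["label"] == "SPAM"
```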

Thank you for your answer.

After making the edit you suggested to README.md, the Inference API worked. I will also keep your warning in mind.


This topic was automatically closed 12 hours after the last reply. New replies are no longer allowed.