Unable to access public model - status 401

I built .NET code to access sentence-transformers/all-MiniLM-L6-v2.
While running, I get the following error:
{StatusCode: 401, Unauthorized}

The token is correct, though, and it has read permission:
using (var client = new HttpClient())
{
    client.DefaultRequestHeaders.Add("Authorization", "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");

    var requestContent = new
    {
        inputs = new string[] { text1, text2 }
    };

    var response = await client.PostAsync("https://api-inference.huggingface.co/models/sentence-transformers/all-MiniLM-L6-v2",
        new StringContent(Newtonsoft.Json.JsonConvert.SerializeObject(requestContent), Encoding.UTF8, "application/json"));

    if (!response.IsSuccessStatusCode)
    {
        Console.WriteLine("Error: " + response.ReasonPhrase);
        return -1;
    }
}

Appreciate your support.


If you are getting a 401 error even though the token is correct, it is usually caused by a network setting such as SSL, or by a different, cached token being used somewhere.

If it were purely a server-side error, I think a status other than 401 would more likely be returned.
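One more thing worth checking: the Inference API expects the `Bearer` scheme in the `Authorization` header, and the .NET snippet above sends the bare token with no scheme, which can by itself produce a 401. A minimal Python sketch of a correctly formed request (the token value is a placeholder, not a real token):

```python
# Sketch: building a correctly formed Inference API request.
# The API expects the "Bearer" auth scheme, so the header value must be
# "Bearer <token>", not the bare token string.

token = "hf_xxx"  # placeholder; never post a real token publicly
headers = {"Authorization": f"Bearer {token}"}

def build_payload(text1, text2):
    # Same request body shape as the .NET snippet in the question.
    return {"inputs": [text1, text2]}

# e.g. with requests:
# requests.post(
#     "https://api-inference.huggingface.co/models/sentence-transformers/all-MiniLM-L6-v2",
#     headers=headers,
#     json=build_payload("hello", "world"),
# )
```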

I checked the links you posted, but I have already verified that the token is working fine, and this is a public repository.


Even when accessing a public repo with a token, you may get an error.

I think the way tokens are handled has changed recently, and the timing and conditions for getting an error have changed a little…

Do you mean that I have to subscribe to one of the plans in order to get access?


Even if you don’t have a paid plan, you should be able to use that model.

Oh no, you’ve leaked a token. :scream: I’ve just noticed. You should invalidate that token. You can create as many tokens as you like, so there’s no need to worry, but it’s dangerous to leave a leaked token active.

I tried it out a bit. To cut a long story short, none of the Inference API endpoints for sentence_similarity work properly. It doesn’t work with or without a token, or with or without the Authorization header itself. It just doesn’t work.

import requests

HF_TOKEN = "hf_my_valid_pro_token"

API_URL = "https://router.huggingface.co/hf-inference/v1"
headers = {"Authorization": f"Bearer {HF_TOKEN}"}

def query(payload):
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

output = query({
    "inputs": {
        "source_sentence": "That is a happy person",
        "sentences": [
            "That is a happy dog",
            "That is a very happy person",
            "Today is a sunny day"
        ]
    },
})

print(output)
# {'error': 'Not allowed to POST v1 for provider hf-inference'}

from huggingface_hub import InferenceClient

client = InferenceClient(
    provider="hf-inference",
    api_key=HF_TOKEN
)

result = client.sentence_similarity(
    model="sentence-transformers/all-MiniLM-L6-v2",
    #model="BAAI/bge-m3", # 503 error
    #model="Qodo/Qodo-Embed-1-1.5B", # 503 error
    sentence="That is a happy person",
    other_sentences=[
        "That is a happy dog",
        "That is a very happy person",
        "Today is a sunny day"
    ],
)

print(result)
#Bad request:
#Input should be a valid dictionary or instance of SentenceSimilarityInputsCheck: received `None` in `parameters`

I cut it short, but in another way: I made semantic similarity work in Python, then called it from .NET using REST.
It works fine now :slight_smile:
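For reference, the Python side of that workaround can be sketched roughly like this (a hypothetical minimal version, not the poster's exact code: the cosine helper is plain Python, the model import is lazy so it only runs when sentence-transformers is installed, and `similarity_scores` would be wrapped in any REST framework such as Flask or FastAPI for the .NET side to call):

```python
import math

def cosine_similarity(a, b):
    # Plain-Python cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def similarity_scores(source_sentence, sentences):
    # Lazy import so this module loads even without sentence-transformers.
    from sentence_transformers import SentenceTransformer
    model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
    embeddings = model.encode([source_sentence] + sentences)
    source_vec, rest = embeddings[0], embeddings[1:]
    return [cosine_similarity(source_vec, vec) for vec in rest]
```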
