'NoneType' Object Error and Token Authorization Issue in OpenAI API Integration

Hello,

I hope this message finds you well. I am currently developing a project that integrates both RunPod's API and Hugging Face's open-source tools. While implementing the OpenAI API, I encountered the following issues:

Issue Description:

  1. Previous Error: Initially, I was receiving a 500 Internal Server Error. After consulting with RunPod's support team, this issue was resolved.
  2. Current Error: My model, MODEL_NAME = 'meta-llama/Llama-3.1-8B-Instruct', is gated. I suspect there might be an issue with my Hugging Face token. When executing the following code:

response = client.chat.completions.create(
    model=model_name,
    messages=input_messages,
    temperature=temperature,
    top_p=0.8,
    max_tokens=2000,
)
print(response)

I receive the following error message:

TypeError: 'NoneType' object is not subscriptable
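
For context, the same request can be reproduced with a plain requests call, which exposes the raw status code and body instead of the 'NoneType' crash; the base URL and API key below are placeholders for the RunPod OpenAI-compatible endpoint:

import requests

# Placeholder values for the RunPod OpenAI-compatible endpoint and key.
BASE_URL = "https://YOUR_RUNPOD_ENDPOINT/v1"
API_KEY = "YOUR_API_KEY"

payload = {
    "model": "meta-llama/Llama-3.1-8B-Instruct",
    "messages": [{"role": "user", "content": "Hello"}],
    "max_tokens": 16,
}

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
)
print(resp.status_code)
print(resp.text)  # the raw body usually states the real error (e.g. a gated-model or auth failure)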

Troubleshooting Steps Taken:

  • Validation: Verified that the API key, model name, and input messages are correctly specified.
  • Response Check: Printed the response object to inspect its contents; it returns None.
  • Conditional Checks: Implemented checks to handle None values gracefully, but the issue persists (a minimal sketch of such a guard appears after the token check below).
  • Token Verification: Checked the validity of my Hugging Face token using the following code:

import requests

token = "YOUR_HUGGING_FACE_TOKEN"
headers = {"Authorization": f"Bearer {token}"}
response = requests.get("https://huggingface.co/api/whoami-v2", headers=headers)

if response.status_code == 200:
    print("Token is valid.")
else:
    print("Token is invalid or unauthorized access.")

The output indicated: "Token is invalid or unauthorized access."
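
The conditional check mentioned above is roughly the following sketch (client, model_name, input_messages, and temperature are defined elsewhere in my script):

response = client.chat.completions.create(
    model=model_name,
    messages=input_messages,
    temperature=temperature,
    top_p=0.8,
    max_tokens=2000,
)

if response is None or not response.choices:
    # Fail with a clear message instead of subscripting a missing body.
    raise RuntimeError("Empty response from the endpoint; check server logs and token access.")

print(response.choices[0].message.content)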

Additional Information:

  • Environment: Utilizing RunPod's API services alongside Hugging Face's open-source tools.
  • Objective: Seeking guidance on resolving the 'NoneType' error and addressing the token authorization issue to successfully retrieve and process responses from the OpenAI API.

I have reviewed similar issues and discussions, such as the Hugging Face forum topic titled "Invalid token passed?", but have not found a solution that resolves my specific problem.

Could you please provide insights or recommendations to address these issues? Your assistance would be greatly appreciated.

Thank you for your support.

hi @Nillly

You're passing a token that starts with 'hf_', right?

import requests

token = "YOUR_HUGGING_FACE_TOKEN"
headers = {"Authorization": f"Bearer {token}"}
response = requests.get("https://huggingface.co/api/whoami-v2", headers=headers)

if response.status_code == 200:
    print("Token is valid.")
else:
    print("Token is invalid or unauthorized access.")

And when you browse to https://huggingface.co/api/whoami-v2 directly, can you check whether it displays your username?
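
If the token itself checks out, it is also worth confirming that it has actually been granted access to the gated repo. A minimal sketch with huggingface_hub (the token string is a placeholder):

from huggingface_hub import HfApi

api = HfApi(token="hf_...")  # placeholder token

print(api.whoami()["name"])  # fails here if the token itself is invalid

try:
    api.model_info("meta-llama/Llama-3.1-8B-Instruct")
    print("Token has access to the gated repo.")
except Exception as err:  # a 401/403 here usually means gated access has not been granted yet
    print("No access to the gated repo:", err)

If that last call fails, you still need to request access on the meta-llama/Llama-3.1-8B-Instruct model page and wait for approval.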

Thanks Mahmut, I handled that problem by myself (I think what fixed it was setting a new API token and restarting all of my programs; after that everything was fine. I also tried this solution as well). Then, while passing the embeddings, I got the same kind of error again. When I run

output = embedding_client.embeddings.create(input=user_prompt, model="BAAI/bge-small-en-v1.5")

I get:

AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: …

I solved that by ticking read/write in the restricted settings of my Infinity Vector Embeddings server. But now I have a new problem, a 403 permission denied error. Why do I get the same error even after supplying the new env URL? I don't understand that.
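
For context, an embedding client like that is typically just the OpenAI client pointed at the Infinity server's OpenAI-compatible endpoint; a rough sketch, with a placeholder URL and key:

from openai import OpenAI

# Placeholder base URL and key; they must match the Infinity deployment's settings
# (include a /v1 prefix only if the server is deployed with one).
embedding_client = OpenAI(
    base_url="https://YOUR_INFINITY_SERVER",
    api_key="YOUR_INFINITY_API_KEY",
)

output = embedding_client.embeddings.create(
    input="example text to embed",
    model="BAAI/bge-small-en-v1.5",
)
print(len(output.data[0].embedding))

A 401 there means the key is not accepted at all, while a 403 usually means the key is accepted but lacks permission for that operation.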

I solved that again… The server wanted me to change the inputs of the first request. This is so scary. I hope this solution helps other people :blush:

This topic was automatically closed 12 hours after the last reply. New replies are no longer allowed.