Dedicated endpoint not matching OpenAI specification

Hi,

I’m using a dedicated inference endpoint hosting a fine-tuned Llama3-8B-Instruct model, and I am trying to use it through the OpenAI-compatible API:

https://[endpoint-id].us-east-1.aws.endpoints.huggingface.cloud/v1/chat/completions

However, when I specify a logit_bias (e.g. {"50256": -100}), I get an error with status 422 and body: “Failed to deserialize the JSON body into the target type: logit_bias: invalid type: map, expected a sequence at line 15 column 18”.
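For reference, here is a minimal reproduction of the request I’m sending (endpoint URL and token are placeholders, and the "tgi" model name is just what I assume the endpoint expects); the logit_bias map follows the OpenAI spec of token ID → bias, but this is the payload that triggers the 422:

```python
import os
import requests

# Placeholder endpoint URL and token; "model": "tgi" is an assumption.
API_URL = "https://[endpoint-id].us-east-1.aws.endpoints.huggingface.cloud/v1/chat/completions"
headers = {
    "Authorization": f"Bearer {os.environ['HF_TOKEN']}",
    "Content-Type": "application/json",
}
payload = {
    "model": "tgi",
    "messages": [{"role": "user", "content": "Hello"}],
    "logit_bias": {"50256": -100},  # OpenAI-style map: token ID -> bias
}

resp = requests.post(API_URL, headers=headers, json=payload)
print(resp.status_code)  # 422
print(resp.text)         # "Failed to deserialize the JSON body into the target type: ..."
```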

Is there any chance that this could be fixed so that it’s compliant with the OpenAI API specification? And if not, what is the expected format for this parameter? Thanks!