I was using the model with the Inference API. Initially it worked fine, but after two or three hours, in the evening, I started getting this error.
How can I resolve this and keep using the model with a free access token?
The same thing happened with Qwen 2.5 VL 7B. I think it has been removed from the Hugging Face serverless service (the serverless Inference API). Try a different provider for the API call, or use a dedicated endpoint — that worked for me.
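To illustrate the "try a different provider" advice, here is a minimal, hypothetical sketch of a fallback loop: try each configured provider in order and use the first one that still serves the model. The provider names and the `call_model` stubs are made up for illustration; in practice each entry would wrap a real client (for example `huggingface_hub.InferenceClient`) and the exception type would match whatever error that client raises when a model is no longer hosted.

```python
# Hypothetical sketch: fall back across inference providers when one
# stops serving a model (e.g. it was removed from the serverless API).

class ModelUnavailableError(Exception):
    """Raised when a provider no longer serves the requested model."""

def query_with_fallback(providers, prompt):
    """Try each (name, call_model) pair in order; return the first reply."""
    errors = {}
    for name, call_model in providers:
        try:
            return name, call_model(prompt)
        except ModelUnavailableError as exc:
            errors[name] = str(exc)  # remember why it failed, keep going
    raise RuntimeError(f"All providers failed: {errors}")

if __name__ == "__main__":
    # Stub providers standing in for real API clients.
    def serverless(prompt):
        raise ModelUnavailableError("model removed from serverless API")

    def dedicated(prompt):
        return f"reply to: {prompt}"

    used, reply = query_with_fallback(
        [("hf-serverless", serverless), ("dedicated-endpoint", dedicated)],
        "describe this image",
    )
    print(used)  # dedicated-endpoint
```

The same shape works whether the fallback target is another hosted provider or your own dedicated endpoint; only the wrapped client changes.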