I was working with meta-llama-3-8b-instruct on my free plan and the model was working fine. I upgraded the account to Pro, and now the model sends empty responses. I don't know why. I even created a new token with "WRITE" permission.
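For reference, here is a minimal sketch of how I'm calling the model, in case it helps reproduce the issue. This assumes the standard Serverless Inference API REST endpoint for meta-llama-3-8b-instruct; the `HF_TOKEN` environment variable and the prompt text are placeholders, not from my actual setup.

```python
import json
import os
import urllib.request

# Assumed endpoint for the Serverless Inference API (text generation).
API_URL = "https://api-inference.huggingface.co/models/meta-llama/Meta-Llama-3-8B-Instruct"

def build_request(token: str, prompt: str):
    """Assemble the URL, headers, and JSON body for a text-generation call."""
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    payload = {
        "inputs": prompt,
        "parameters": {"max_new_tokens": 64, "return_full_text": False},
    }
    return API_URL, headers, payload

def send(token: str, prompt: str):
    """POST the request and return the decoded JSON response."""
    url, headers, payload = build_request(token, prompt)
    req = urllib.request.Request(
        url, data=json.dumps(payload).encode("utf-8"), headers=headers
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        # With my Pro account, the "generated_text" here comes back empty.
        return json.load(resp)

# Usage (requires a valid token in HF_TOKEN):
#   result = send(os.environ["HF_TOKEN"], "Hello!")
#   print(result)
```

On the free plan the same call returned normal generations, so nothing in the request itself changed after the upgrade.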
@meganariley This looks like a Serverless Inference API issue, but also a paid-service issue…