Text generation for “Llama-3.2-3B-Instruct” works fine without any prompt template, but as soon as I add a template the request fails with a 503 error.
Any idea what is going wrong here?
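For reference, this is roughly what I mean by “with” and “without” a template (a minimal sketch using the `huggingface_hub` client; the token and prompt are placeholders, and the templated string is a hand-written approximation of the Llama 3 chat format):

```python
from huggingface_hub import InferenceClient

client = InferenceClient(model="meta-llama/Llama-3.2-3B-Instruct", token="hf_...")  # placeholder token

# Plain text generation without any prompt template -- this works.
print(client.text_generation("Write a haiku about autumn.", max_new_tokens=64))

# The same prompt wrapped in a Llama-3-style chat template -- this is what returns a 503 for me.
templated = (
    "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n"
    "Write a haiku about autumn.<|eot_id|>"
    "<|start_header_id|>assistant<|end_header_id|>\n\n"
)
print(client.text_generation(templated, max_new_tokens=64))
```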
I don’t think there is a problem on your end. There was a major outage a few days ago, and it seems that the Inference API for that model has been turned off since then.
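If you want to confirm this from your side, you can query the model’s deployment status on the serverless Inference API (a small sketch assuming a recent `huggingface_hub` version; the token is a placeholder):

```python
from huggingface_hub import InferenceClient

# Returns a ModelStatus object with fields like `loaded` and `state`.
status = InferenceClient(token="hf_...").get_model_status("meta-llama/Llama-3.2-3B-Instruct")
print(status)
```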