And now the models
black-forest-labs/FLUX.1-dev
black-forest-labs/FLUX.1-schnell
stabilityai/stable-diffusion-3.5-large-turbo
aren’t working either—they were some of the few that were still usable.
If Hugging Face is just a platform to make money, that's fine, but they should say so outright. No more talk of democratization, open source, or any of that, because in that respect HF has completely failed.
Confirmed - even “stabilityai/stable-diffusion-xl-base-1.0” stopped working.
And I'm with javarribas. HF staff: if this platform is turning into something to make money, be honest about that. I'm willing to pay here, if I get a stable (!!) dev environment without breaking changes left, right, and center.
I understand that servers cost money, but right now HF is completely broken for me.
I'm getting the error "An error occurred while fetching the blob" while using the model facebook/bart-large-mnli with Node.js (v20.19.0):
Error: An error occurred while fetching the blob
at innerRequest (/Users/vt/test/node_modules/@huggingface/inference/dist/index.cjs:1737:11)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async zeroShotClassification (/Users/vt/test/node_modules/@huggingface/inference/dist/index.cjs:2235:25)
at async classifyText (/Users/vt/test/test.js:27:22)
at async extractFilterCriteria (/Users/vt/test/test.js:42:24)
at async main (/Users/vt/test/test.js:117:32)
This is how I call the model facebook/bart-large-mnli
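The original call snippet doesn't seem to have survived the paste. For reference, here is a minimal sketch of the same request using only Node's built-in fetch (no client library); the payload mirrors the cURL call below, and HF_API_KEY is a placeholder you'd set in your environment:

```javascript
// Sketch only: posts a zero-shot classification request to the
// HF Inference router with Node's built-in fetch (Node 18+).
const payload = {
  inputs:
    "Hi, I recently bought a device from your company but it is not working as advertised and I would like to get reimbursed!",
  parameters: { candidate_labels: ["refund", "legal", "faq"] },
};

async function classify() {
  const res = await fetch(
    "https://router.huggingface.co/hf-inference/models/facebook/bart-large-mnli",
    {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // HF_API_KEY is a placeholder for your actual token
        Authorization: `Bearer ${process.env.HF_API_KEY}`,
      },
      body: JSON.stringify(payload),
    }
  );
  // Surface gateway errors (e.g. the 504 below) instead of a vague blob error
  if (!res.ok) throw new Error(`HTTP ${res.status} ${res.statusText}`);
  return res.json();
}
```

Checking `res.ok` here makes a 504 show up as an explicit HTTP error rather than the client library's opaque "error occurred while fetching the blob".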
I have also tried cURL and got a 504 Gateway Timeout:
curl https://router.huggingface.co/hf-inference/models/facebook/bart-large-mnli \
-X POST \
-d '{"inputs": "Hi, I recently bought a device from your company but it is not working as advertised and I would like to get reimbursed!", "parameters": {"candidate_labels": ["refund", "legal", "faq"]}}' \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer HF_API_KEY'
This call to sentence-transformers/all-MiniLM-L6-v2 works fine:
Actually, it has nothing to do with blobs; you're probably using an old version of the HF Inference client: Inference API stopped working - #40 by TOOTHED. And yes, the new version will instead give "No Inference Provider available for model".
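One way to sidestep client-version routing surprises entirely is to call the router over raw HTTP, where the provider is pinned by the URL path itself. A small sketch (the "hf-inference" slug for the HF-hosted backend matches the cURL URL above; other provider slugs are assumptions you'd verify against the docs):

```javascript
// Sketch: with raw HTTP, the provider segment in the router URL
// selects the backend explicitly, independent of client version.
const ROUTER = "https://router.huggingface.co";

function routerUrl(provider, model) {
  // e.g. routerUrl("hf-inference", "facebook/bart-large-mnli")
  return `${ROUTER}/${provider}/models/${model}`;
}
```

You'd then POST your task payload to that URL with a Bearer token, exactly as in the cURL example earlier in the thread.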