| Topic | Replies | Views | Activity |
| --- | --- | --- | --- |
| Inference Endpoints / Model choices / Help | 1 | 42 | July 10, 2025 |
| Created Huggingface Inference API for ASR with whisper/large-v3 model, but after the API is created the playground itself throws an undefined error | 2 | 64 | July 8, 2025 |
| Memory Alignment Fragmentation Fix for Transformers Inference (YOLO/BERT) — Kernel-Level + Runtime Observations | 0 | 117 | July 7, 2025 |
| Scheduling failure: unable to schedule | 5 | 54 | June 27, 2025 |
| Huggingface token usage for routed requests for a custom provider | 0 | 53 | June 26, 2025 |
| Inference result not aligned with local version of the same model and revision | 15 | 94 | June 26, 2025 |
| HF Inference API has been returning the same 404 exception for all models for the last few minutes | 45 | 2445 | June 25, 2025 |
| Requirements for Hosting LLM via Inference Endpoints | 2 | 72 | June 13, 2025 |
| Inference API stopped working | 50 | 6338 | June 8, 2025 |
| 404 Error When Calling the Hugging Face Inference API via Dify | 3 | 293 | June 2, 2025 |
| How to run agents from `smolagents` locally? | 4 | 1072 | May 27, 2025 |
| Rejected Endpoint | 1 | 54 | May 20, 2025 |
| Has inference API stopped returning text embeddings? | 1 | 89 | May 17, 2025 |
| Inference API Rate Limits | 1 | 478 | May 16, 2025 |
| Cerebras Inference Error | 0 | 79 | May 12, 2025 |
| Inference endpoint taking forever to initialize | 1 | 75 | May 12, 2025 |
| Unable to get inference results after deploying model to Inference Endpoints | 0 | 22 | May 8, 2025 |
| Cannot use Inference Provider. 429 error. First time usage | 6 | 94 | May 5, 2025 |
| Cannot execute any model with my API Token, models are timed out | 6 | 2931 | May 1, 2025 |
| HFAPIModel pricing | 2 | 60 | April 30, 2025 |
| Error 402 while using smolagents with a valid token | 7 | 82 | April 30, 2025 |
| RuntimeError: The size of tensor a (48) must match the size of tensor b (64) at non-singleton dimension 0 | 1 | 310 | April 29, 2025 |
| Inference API error with Whisper, return_timestamps parameter | 13 | 1056 | April 25, 2025 |
| Constant 503 error for several days when running LLAMA 3.1 | 5 | 459 | April 25, 2025 |
| Inference API returns 504 error for Llama-3.2-3B-Instruct & google/gemma-2-2b-it | 3 | 48 | April 21, 2025 |
| Error 400 when I update endpoints to latest version | 3 | 77 | April 20, 2025 |
| Inference benchmark (vllm with nginx) | 1 | 145 | April 17, 2025 |
| Too large to be loaded automatically (16GB > 10GB) issue with QWEN 2.5 VL 7B | 2 | 138 | April 15, 2025 |
| Inference API cost changed for meta-llama-3.3-70b? | 3 | 343 | April 13, 2025 |
| Tool calling gets stuck in an infinite loop | 2 | 423 | April 12, 2025 |