Summary:
My Docker Space builds fine and responds to GET /, but every POST request to my defined /api/predict-emotion route returns a 404 HTML page (not a 500 or a JSON error). The logs never show the POST requests. I have tried every documented troubleshooting step; please help!
My Setup:
All files (app.py, Dockerfile, requirements.txt, models, etc.) are at repo root
Hi @John6666, thank you so much for your quick reply. I checked, and I am POSTing to the correct .hf.space endpoint, but I still get the 404 HTML. No POST lines in the logs. I have rebuilt and even tried a new Space with only a minimal app.py. I am wondering whether this is an infra/routing bug.
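(For anyone trying to reproduce this: a quick stdlib-only way to fire the same POST from outside the Space. The base URL below is a placeholder, not the real Space URL.)

```python
# check_endpoint.py -- the Space URL is a placeholder; substitute your own *.hf.space host
import json
import urllib.request

def build_request(base_url: str, route: str, payload: dict) -> urllib.request.Request:
    """Build a POST request with a JSON body for the given Space route."""
    data = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        base_url.rstrip("/") + route,
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("https://example.hf.space", "/api/predict-emotion", {"text": "hello"})
# urllib.request.urlopen(req) would actually send it; a 404 HTML body here
# would confirm the behavior described above.
```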
Well, if it’s a bug, I don’t know where to report it. I guess we could just email support, but it would be better to verify the bug openly if possible…
Thank you so much for your speedy and consistent replies. I am trying email support, because hub-docs looks more appropriate for documentation issues. Fingers crossed.
@Angelina067 Thank you so much for such a concrete suggestion. Sadly, I have tried everything you suggested, but it's still not working.
INFO:emotion-backend:Model loaded OK.
INFO: Started server process [1]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:7860 (Press CTRL+C to quit)
OMG! So sorry. It's frustrating! Since your server is running fine but Hugging Face is still returning HTML 404s for POST requests, this is definitely a Hugging Face Spaces infrastructure issue.
I think Hugging Face Spaces has routing limitations - it’s designed primarily for GET requests (web interfaces) and doesn’t always handle POST requests to custom API endpoints properly.
Maybe you should use the root path (the easiest fix): move your endpoint to the root.
✓ Consider using their hosted inference API rather than a custom Docker space for POST endpoints.
P.S.: The issue is that Hugging Face Spaces infrastructure doesn't properly route POST requests to custom API endpoints. This is a known limitation. The quickest fix is either moving the endpoint to the root path or using a Gradio interface, which Hugging Face Spaces is optimized for.
This isn’t a code problem - it’s a Hugging Face Spaces platform limitation with POST routing.
Emotion Detection - a Hugging Face Space by boloappde. I made the new Space public and used Gradio this time. Initially I used Gradio, then moved to Docker because I thought that would be better, but after enough troubleshooting I changed my mind. There's a problem with the current ML model (it only outputs "happy", even when the input is not!). After fixing it, I will try to connect it to the web app.
Okay, so in the end I just embedded the Gradio interface in the web app. I am extremely happy and grateful to both of you, @Angelina067 and @John6666. Thank you so much for being so kind and wonderful.