Hello there,
I was playing around with Inference Endpoints and I’m wondering if you think it could be interesting to add support for the OpenAPI specification.
For example, if you implement a complex custom handler for your endpoint, one that takes a bunch of parameters and returns a complex JSON object, it might be useful for users to have a full, standard description of the service.
Maybe the inference endpoint could expose an OpenAPI JSON file included in the repository alongside the handler.py and requirements.txt files.
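To illustrate what I mean, here is a minimal sketch of the kind of custom handler whose request and response shapes such a file could document. The EndpointHandler class shape follows the usual custom handler convention for handler.py, but the parameter names (top_k, threshold) and the nested response are purely hypothetical examples I made up:

```python
# handler.py — minimal sketch of a custom handler whose request/response
# shapes an OpenAPI file could describe. The parameter names ("top_k",
# "threshold") and the response layout are hypothetical, for illustration only.
from typing import Any, Dict


class EndpointHandler:
    def __init__(self, path: str = ""):
        # Load the model / pipeline here (omitted in this sketch).
        self.path = path

    def __call__(self, data: Dict[str, Any]) -> Dict[str, Any]:
        inputs = data.get("inputs", "")
        params = data.get("parameters", {}) or {}
        top_k = params.get("top_k", 5)
        threshold = params.get("threshold", 0.0)

        # A nested response like this is exactly the kind of payload an
        # OpenAPI description would help consumers understand.
        return {
            "query": inputs,
            "results": [{"label": "example", "score": 0.99, "rank": 1}][:top_k],
            "metadata": {"threshold": threshold, "model_path": self.path},
        }
```

A consumer calling the endpoint has no way to discover those parameter names or the response structure today other than reading handler.py itself.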
What do you think?
Thank you!
Regards
How would the service know which parameters are expected if you have a custom handler, which is loaded dynamically? That sounds like a lot of overhead for the user who creates the custom handler.
Hey Philipp, thanks for your answer.
Of course, generating the OpenAPI spec dynamically is completely off the table. I meant that the user who implements the handler could have the option to add an OpenAPI JSON file that describes the service in a standard way.
You are right that this means extra work for the developer, because it implies writing the file by hand, so it shouldn’t be mandatory, just optional. In my opinion it could be helpful for people interested in using the endpoint.
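To make it concrete, here is a rough sketch of the kind of openapi.json the handler author could maintain by hand and commit next to handler.py. It is written as a Python dict only so it can be dumped with json; the route, parameter names, and schemas mirror the hypothetical handler above and are not an actual API:

```python
# generate_openapi.py — hypothetical helper the handler author could run once
# to produce openapi.json; the schemas below describe the made-up handler above.
import json

spec = {
    "openapi": "3.0.3",
    "info": {"title": "My custom endpoint", "version": "1.0.0"},
    "paths": {
        "/": {
            "post": {
                "summary": "Run the custom handler",
                "requestBody": {
                    "required": True,
                    "content": {
                        "application/json": {
                            "schema": {
                                "type": "object",
                                "required": ["inputs"],
                                "properties": {
                                    "inputs": {"type": "string"},
                                    "parameters": {
                                        "type": "object",
                                        "properties": {
                                            "top_k": {"type": "integer", "default": 5},
                                            "threshold": {"type": "number", "default": 0.0},
                                        },
                                    },
                                },
                            }
                        }
                    },
                },
                "responses": {
                    "200": {
                        "description": "Structured prediction results",
                        "content": {
                            "application/json": {
                                "schema": {
                                    "type": "object",
                                    "properties": {
                                        "query": {"type": "string"},
                                        "results": {"type": "array", "items": {"type": "object"}},
                                        "metadata": {"type": "object"},
                                    },
                                }
                            }
                        },
                    }
                },
            }
        }
    },
}

# The author writes this file once; the endpoint would simply serve it as-is.
with open("openapi.json", "w") as f:
    json.dump(spec, f, indent=2)
```

The endpoint wouldn’t need to understand the file at all, only expose it, so the dynamic-loading concern doesn’t really apply.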
just an idea 
Thanks again