HuggingFace Inference Endpoints: Pipeline Args

I’ve been running into some of the same things: I get different responses running locally vs. building to a Docker Space for API inference, which is super frustrating.

From what I’m picking up, the request payload has different dictionary entries depending on the task type. Typically there’s an `inputs` field plus an optional `parameters`/`options` dict that gets forwarded along. Would love to see the source on this stuff.
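
In case it helps, here's a rough sketch of the payload shape I've been working from. This is just an illustration, assuming a text-classification style endpoint; the endpoint URL and token env vars are placeholders, and the exact keys each task accepts can differ (e.g. question-answering wants a dict under `inputs`).

```python
import os
import requests

# Placeholders -- point these at your own Inference Endpoint / token.
ENDPOINT_URL = os.environ["HF_ENDPOINT_URL"]
HF_TOKEN = os.environ["HF_TOKEN"]

headers = {"Authorization": f"Bearer {HF_TOKEN}"}

# Typical shape for a text-classification style task:
#   "inputs"     -> the actual data for the pipeline
#   "parameters" -> kwargs forwarded to the pipeline call
#   "options"    -> API-level behaviour (e.g. wait for a cold model)
payload = {
    "inputs": "I love this movie!",
    "parameters": {"top_k": 3},
    "options": {"wait_for_model": True},
}

# A question-answering task would instead expect something like:
#   {"inputs": {"question": "...", "context": "..."}}

response = requests.post(ENDPOINT_URL, headers=headers, json=payload)
response.raise_for_status()
print(response.json())
```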

Relevant info here