HuggingFace Inference Endpoints: Pipeline Args

Hi! I’m trying to use the Hugging Face Inference Toolkit with a pipeline, following the guide below :slight_smile: :

My question is this: I need a token-classification pipeline. I’m able to deploy it successfully, but is there a way to specify the aggregation strategy?

Thanks!

It uses the same parameters as transformers. You can provide a "parameters" key when running inference, with aggregation_strategy inside it.
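To make that concrete, here is a minimal sketch of what such a request body could look like. The input sentence is just an example; the point is that pipeline kwargs like aggregation_strategy go under a top-level "parameters" key:

```python
import json

# Example request body for a token-classification endpoint.
# Inference parameters live under "parameters", mirroring the
# keyword arguments of the underlying transformers pipeline.
payload = {
    "inputs": "Hello, welcome to the Olympic Games in London",
    "parameters": {"aggregation_strategy": "first"},
}

body = json.dumps(payload)  # serialized JSON sent to the endpoint
```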

Thanks for replying Phil! :slight_smile:

Is this what you mean? ( this didn’t work :confused: )
input_pack = {"inputs": "Hello welcome to The Olympic Games in London",
              "parameters": {"aggregation_strategy": "first"}}

Or do you mean passing it in as an env variable, like HF_TASK?
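For what it’s worth, when I debug payloads like this I find it helps to separate building the request from sending it. A minimal sketch with placeholder values (the endpoint URL and token below are not real, and the actual POST is left commented out):

```python
import json

API_URL = "https://<your-endpoint>.endpoints.huggingface.cloud"  # placeholder
HF_TOKEN = "hf_..."  # placeholder token

def build_request(payload):
    """Return (headers, body) for a JSON POST to an inference endpoint."""
    headers = {
        "Authorization": f"Bearer {HF_TOKEN}",
        "Content-Type": "application/json",
    }
    return headers, json.dumps(payload)

headers, body = build_request({
    "inputs": "Hello welcome to The Olympic Games in London",
    "parameters": {"aggregation_strategy": "first"},
})
# requests.post(API_URL, headers=headers, data=body)  # actual call
```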

I’ve been running into some of the same things, and I’m getting different responses locally vs. building into a Docker space for API inference, which is super frustrating.

From what I’m picking up, the tasks expect different dictionary entries depending on the task type: typically an inputs field plus some options. Would love to see the source for this stuff.
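I haven’t traced the actual toolkit source either, but the pattern it presumably follows is to split the request body into "inputs" and "parameters" and forward the latter as keyword arguments to the pipeline. A stub sketch of that assumed dispatch (the function names here are made up for illustration, not the toolkit’s real API):

```python
def dispatch(pipeline_fn, body):
    # Assumed split: "inputs" is passed positionally, and everything
    # under "parameters" is unpacked as keyword arguments.
    inputs = body["inputs"]
    params = body.get("parameters", {})
    return pipeline_fn(inputs, **params)

# Stub standing in for a transformers token-classification pipeline.
def fake_pipeline(text, aggregation_strategy="none"):
    return {"text": text, "strategy": aggregation_strategy}

result = dispatch(fake_pipeline, {
    "inputs": "Hello London",
    "parameters": {"aggregation_strategy": "first"},
})
```

If the toolkit follows this pattern, any kwarg the pipeline accepts should be usable from the "parameters" dict.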

Relevant info here