Hi @induveca,
Why would you say the endpoint is useless without being able to specify
max_length? The models are trained to summarize, so the default output should already be usable.
That said, you can still pass
max_length even though it is not documented (and because it is undocumented, its behavior is subject to change in the future). It is specified in tokens, which is hard to gauge if you are only working with raw text.
If you do use it, expect that the output may be cut off mid-sentence: you are forcing the model to stop generating even if it still had tokens to emit.
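As a rough sketch of what passing the parameter looks like (the payload shape is undocumented, so treat the model name and field names here as assumptions rather than a guaranteed interface):

```python
import json

# Hypothetical model URL for illustration; substitute the model you are calling.
API_URL = "https://api-inference.huggingface.co/models/facebook/bart-large-cnn"

def build_payload(text, max_length):
    # max_length is counted in tokens, not characters or words, so the same
    # value produces different text lengths depending on the model's tokenizer.
    return {
        "inputs": text,
        "parameters": {"max_length": max_length},
    }

payload = build_payload("Some long article text to summarize...", 60)
print(json.dumps(payload, indent=2))

# To actually send it you would POST with your token, e.g.:
# import requests
# requests.post(API_URL,
#               headers={"Authorization": "Bearer <your-token>"},
#               json=payload)
```

Since the argument is undocumented, it is worth keeping it behind a single helper like this so there is one place to update if the accepted payload shape changes.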