Using T5-Base via Inference API

Hi,

I’m trying to use the T5-base model for a summarization task via the Inference API. I added the T5-specific prefix "summarize: " to the input text. However, the API returns `translation_text` as the output key instead of `summary_text`.
(I was able to use the t5-base model for summarization locally with a model and tokenizer, as described here: Summary of the tasks — transformers 4.7.0 documentation.)
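For reference, this is roughly how I’m calling the API. A minimal sketch with only the standard library, assuming the usual `api-inference.huggingface.co/models/<model_id>` endpoint and a personal API token (the `hf_xxx` placeholder below is not a real token). The output key seems to depend on which task the API associates with the model, which may explain the `translation_text` result:

```python
import json
import urllib.request

# Standard Inference API endpoint pattern (assumption: t5-base is served here)
API_URL = "https://api-inference.huggingface.co/models/t5-base"

def build_payload(text: str) -> dict:
    """Prepend the T5-specific summarization prefix to the input text."""
    return {"inputs": "summarize: " + text}

def query(text: str, api_token: str) -> list:
    """POST the payload to the Inference API and return the parsed JSON.

    Note: the key in the response (summary_text vs. translation_text)
    is decided server-side by the task configured for the model.
    """
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(text)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    # Needs a valid token and network access, so not executed here:
    # print(query("The tower is 324 metres tall ...", "hf_xxx"))
    pass
```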

Hello @mohsenalam. Did you solve your issue?
On my side, I would like to use the Inference API for T5-base as well.
I opened this thread: How to get Accelerated Inference API for T5 models?