Mt0 models are text2text-generation models, not text-generation


For some reason, the pipeline tag for the bigscience mt0 models is text-generation when it should be text2text-generation. This is evident from invoking the transformers pipelines function get_task("bigscience/mt0-small") (see below). When instantiating the pipeline, not passing the "text2text-generation" task parameter causes a warning to be shown. It seems the pipeline_tag value on the Model Hub is incorrect.
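A minimal way to reproduce the mismatch (a sketch, assuming `transformers` is installed and the machine has network access to the Hub; `get_task` lives in `transformers.pipelines` and resolves the task from the repo's pipeline_tag):

```python
from transformers.pipelines import get_task

# get_task queries the Hub's pipeline_tag for the given repo id.
# For bigscience/mt0-small this currently resolves to "text-generation",
# although mt0 models are seq2seq (encoder-decoder) and should be
# tagged "text2text-generation".
task = get_task("bigscience/mt0-small")
print(task)
```

As a workaround until the tag is fixed, passing the task explicitly, e.g. `pipeline("text2text-generation", model="bigscience/mt0-small")`, avoids the warning.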