Use EncoderDecoder models for text summarization

:wave: Please read the topic category description to understand what this is all about


Most of the available Transformer models for text summarization only support English documents. At the same time, there are now many pretrained BERT-like models in non-English languages. The goal of this project is to explore whether the [EncoderDecoder architecture](Encoder Decoder Models — transformers 4.12.2 documentation) in :hugs: Transformers can be used to create summarization models from just the pretrained weights of encoder-only models.

Your task is to pick a pretrained encoder in a non-English language and train it to summarise texts in that language.
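As a rough sketch of the warm-starting recipe (the Spanish BETO checkpoint below is only an example — swap in any encoder for your language), you can initialize both the encoder and the decoder from the same encoder-only checkpoint with `EncoderDecoderModel.from_encoder_decoder_pretrained`:

```python
from transformers import AutoTokenizer, EncoderDecoderModel

# Illustrative choice: BETO, a Spanish BERT. Any BERT-like encoder works here.
checkpoint = "dccuchile/bert-base-spanish-wwm-cased"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)

# Warm-start encoder and decoder from the same encoder-only checkpoint.
# The decoder's cross-attention layers are newly initialized and must be
# learned during fine-tuning on a summarization corpus.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(checkpoint, checkpoint)

# BERT has no seq2seq special tokens, so set them explicitly for generation.
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.eos_token_id = tokenizer.sep_token_id
model.config.pad_token_id = tokenizer.pad_token_id
```

After this setup the model can be fine-tuned like any seq2seq model, e.g. with `Seq2SeqTrainer`.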


See here for example models that people have fine-tuned using this architecture. Your task is to create your very own model with this technique!


Search for summarization datasets on the Hub to find an appropriate corpus for this task.


Text summarization is a tricky NLP task, so the performance obtained with these models may not match that of their English counterparts (for which much more data is available).

Desired project outcomes

  • Create a Streamlit or Gradio app on :hugs: Spaces that can summarize a document in your chosen language
  • Don't forget to push all your models and datasets to the Hub so others can build on them!

Additional resources

Discord channel

To chat and organise with other people interested in this project, head over to our Discord and:

Follow the instructions on the #join-course channel. Then join one of the following channels:

  • #encoder-decoder-es channel (Spanish)

Just make sure you comment here to indicate that you'll be contributing to this project :slight_smile:


Interesting project! I am interested in building a Spanish summarizer using an Encoder-Decoder model. Anyone else interested in this approach?

1 Like

Hey @edumunozsala, cool to hear that you're interested in tackling this project! I've created a Discord channel (see topic description) in case you and others want to use it :slight_smile:

1 Like