When using a Hugging Face transformer, do I need to tokenize the input text for summarization? I have read text from a PDF, which is super long. Can I pass this directly into the transformer for summarization?
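For context, here is a minimal sketch of what I had in mind. It assumes the `facebook/bart-large-cnn` checkpoint (any summarization checkpoint should work), and `summarize_long_text` is a hypothetical helper of mine to work around the ~1024-token input limit that most summarization models have:

```python
from transformers import pipeline

# The pipeline tokenizes internally, so raw text can be passed in directly.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

def summarize_long_text(text: str, chunk_tokens: int = 900) -> str:
    """Split a long text on token boundaries, summarize each chunk,
    and join the chunk summaries. (Hypothetical workaround for the
    model's input-length limit.)"""
    tokenizer = summarizer.tokenizer
    # Tokenize once only to find chunk boundaries; the pipeline
    # re-tokenizes each chunk on its own.
    ids = tokenizer.encode(text, add_special_tokens=False)
    chunks = [
        tokenizer.decode(ids[i : i + chunk_tokens])
        for i in range(0, len(ids), chunk_tokens)
    ]
    results = summarizer(chunks, max_length=150, min_length=30, truncation=True)
    return " ".join(r["summary_text"] for r in results)

pdf_text = "..."  # text extracted from the PDF
print(summarize_long_text(pdf_text))
```

Is this chunk-then-summarize approach the right way to handle text longer than the model's limit, or is there a better option?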