When using a Hugging Face transformer, do I need to tokenize the input text for summarization? I have read text from a PDF, and it is very long. Can I pass it directly into the transformer for summarization?
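For context: Hugging Face's `pipeline("summarization")` tokenizes for you, but most summarization models cap input at roughly 1024 tokens, so a long PDF text typically has to be split into chunks first and the chunk summaries combined. Below is a minimal sketch of that chunking step. It uses whitespace word-splitting as a stand-in for the model's real tokenizer so it runs without downloading a model; `chunk_text`, `max_words`, and `overlap` are illustrative names and values, not a library API.

```python
def chunk_text(text, max_words=400, overlap=50):
    """Split long text into overlapping word-based chunks.

    A real setup would count model tokens with the model's own
    tokenizer (e.g. len(tokenizer(text)["input_ids"])); plain
    word counts just approximate the idea here.
    """
    words = text.split()
    chunks = []
    step = max_words - overlap  # overlap keeps context across chunk edges
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + max_words]))
        if start + max_words >= len(words):
            break
    return chunks

# Each chunk would then be summarized separately, e.g. with
# pipeline("summarization")(chunk), and the partial summaries
# concatenated (or summarized once more for a final pass).
long_text = " ".join(f"word{i}" for i in range(1000))
pieces = chunk_text(long_text)
print(len(pieces))  # → 3
```

If you call the pipeline directly on the raw string, it will tokenize internally but silently truncate anything beyond the model's limit, so chunking (or a long-input model) is what actually preserves the full document.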