When using a Hugging Face transformer, do I need to tokenize the input text for summarization? I have read text from a PDF, and it is very long. Can I pass it directly into the transformer for summarization?
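In general the `pipeline` API tokenizes for you, but most summarization models accept only a bounded number of tokens (often around 1024), so a long PDF text usually has to be split into chunks first. Below is a minimal sketch of that idea; the `chunk_text` helper is hypothetical and uses word counts as a rough stand-in for token counts, and the commented `pipeline` usage assumes the `transformers` library with a model such as `facebook/bart-large-cnn`.

```python
def chunk_text(text, max_words=400):
    """Split text into chunks of at most max_words words.

    Word count is only a rough proxy for token count; in practice you
    would count tokens with the model's own tokenizer instead.
    """
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]


# With the transformers library installed, each chunk could then be
# summarized roughly like this (sketch, not run here):
#
#   from transformers import pipeline
#   summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
#   summaries = [summarizer(c, truncation=True)[0]["summary_text"]
#                for c in chunk_text(pdf_text)]

# Example: 1000 words split into 400-word chunks -> 3 chunks.
chunks = chunk_text("word " * 1000)
print(len(chunks))
```

The chunk summaries can then be concatenated, or summarized again in a second pass if a single short summary is wanted.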