When using a Hugging Face transformer, do I need to tokenize the input text myself for summarization? I've extracted text from a PDF, and it's super long. Can I pass it directly into the transformer for summarization?
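For context, here is roughly what I'm trying — a minimal sketch assuming a BART-style model with a ~1024-token input limit; the model choice, file name, and chunk size are just placeholders I picked:

```python
from transformers import pipeline

# The pipeline tokenizes internally, so no manual tokenization is needed,
# but the model still has a fixed input limit (~1024 tokens for this one).
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

with open("extracted_pdf_text.txt") as f:  # text pulled out of the PDF
    long_text = f.read()

# Split the long text into rough word-based chunks to stay under the
# token limit, then summarize each chunk separately.
words = long_text.split()
chunk_size = 600  # words per chunk, a guess to stay under ~1024 tokens
chunks = [" ".join(words[i:i + chunk_size])
          for i in range(0, len(words), chunk_size)]

summaries = [
    summarizer(chunk, max_length=130, min_length=30, do_sample=False)[0]["summary_text"]
    for chunk in chunks
]
print(" ".join(summaries))
```

Is chunking like this the right approach, or is there a model/setting that handles the full text directly?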