I am working with PyTorch on a sequence-to-class (sequence classification) task. I want to use pre-trained embeddings and then write my own forward pass. How do I do it (that is, load the pre-trained embedding weights into the embedding layer and then move forward with my own code)?
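A minimal sketch of one common approach, assuming the pre-trained weights are available as a tensor (the weight matrix here is random for illustration; in practice you would load your saved embeddings, e.g. with `torch.load`): wrap the matrix with `nn.Embedding.from_pretrained` and define a custom `forward` that does whatever pooling and classification you need.

```python
import torch
import torch.nn as nn

# Hypothetical pre-trained embedding matrix for illustration only;
# replace with your real weights, e.g. torch.load("embeddings.pt").
vocab_size, embed_dim, num_classes = 1000, 64, 4
pretrained_weights = torch.randn(vocab_size, embed_dim)

class SequenceClassifier(nn.Module):
    def __init__(self, weights, num_classes, freeze=True):
        super().__init__()
        # freeze=True keeps the pre-trained embeddings fixed during training;
        # set freeze=False if you want to fine-tune them.
        self.embedding = nn.Embedding.from_pretrained(weights, freeze=freeze)
        self.classifier = nn.Linear(weights.size(1), num_classes)

    def forward(self, input_ids):
        # Your own forward: embed, mean-pool over the sequence, classify.
        embedded = self.embedding(input_ids)  # (batch, seq_len, embed_dim)
        pooled = embedded.mean(dim=1)         # (batch, embed_dim)
        return self.classifier(pooled)        # (batch, num_classes)

model = SequenceClassifier(pretrained_weights, num_classes)
input_ids = torch.randint(0, vocab_size, (2, 10))  # batch of 2 sequences
logits = model(input_ids)
print(logits.shape)  # torch.Size([2, 4])
```

If the "pre-trained embedding" you mean is the embedding layer of a Hugging Face model such as BERT, the same pattern applies: extract that layer (e.g. `model.get_input_embeddings()`), put it inside your own `nn.Module`, and implement the rest of the forward yourself.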