I am working with PyTorch on sequence classification. I want to use a pre-trained embedding and write my own forward pass. How do I do that (i.e., load the pre-trained weights into the embedding layer and then continue forward with my own code)?
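One way to sketch this in plain PyTorch: copy a pre-trained weight matrix into `nn.Embedding` via `nn.Embedding.from_pretrained`, then define whatever `forward` you like on top of it. The weight tensor below is random as a stand-in; in practice you would take it from a checkpoint (e.g. a Hugging Face model's input embeddings via `model.get_input_embeddings().weight`). The class name, pooling choice, and dimensions are illustrative assumptions, not a fixed recipe.

```python
import torch
import torch.nn as nn

# Stand-in for a pre-trained embedding matrix (vocab_size x embed_dim).
# In practice, load this from a checkpoint or a pre-trained model.
pretrained_weights = torch.randn(1000, 128)

class MyClassifier(nn.Module):
    def __init__(self, weights, num_classes=2, freeze=True):
        super().__init__()
        # Load the pre-trained weights into the embedding layer;
        # freeze=True keeps them fixed during training.
        self.embedding = nn.Embedding.from_pretrained(weights, freeze=freeze)
        self.classifier = nn.Linear(weights.size(1), num_classes)

    def forward(self, input_ids):
        # Custom forward: embed, mean-pool over the sequence, classify.
        embedded = self.embedding(input_ids)   # (batch, seq_len, embed_dim)
        pooled = embedded.mean(dim=1)          # (batch, embed_dim)
        return self.classifier(pooled)         # (batch, num_classes)

model = MyClassifier(pretrained_weights)
input_ids = torch.randint(0, 1000, (4, 16))   # batch of 4 sequences, length 16
logits = model(input_ids)
print(logits.shape)  # torch.Size([4, 2])
```

Set `freeze=False` if you want the pre-trained embeddings to be fine-tuned along with the classifier head.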
Related topics
| Topic | Replies | Views | Activity |
|---|---|---|---|
| How can I change the forward function of BertForSequenceClassification | 0 | 1886 | August 14, 2021 |
| Bypassing tokenizers | 2 | 423 | November 23, 2020 |
| Do we need to load a model twice to get embeddings and probabilities? | 3 | 1463 | December 18, 2021 |
| How to train new token embedding to add to a pretrain model? | 1 | 3672 | January 6, 2021 |
| Modify bert embeddings | 0 | 382 | January 18, 2022 |