I am working with PyTorch on sequence classification. I want to use pre-trained embeddings and then write my own forward pass. How do I do it (namely, load the pre-trained embedding weights and then move forward with my own code)?
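One way to do this, sketched in plain PyTorch: wrap a pre-trained weight matrix with `nn.Embedding.from_pretrained` and write any forward you like on top of it. The weight matrix below is random for illustration; in practice you would take it from your trained model (e.g. via `model.get_input_embeddings().weight` on a Hugging Face model). The module name `MyClassifier` and the mean-pooling head are assumptions, not a fixed API.

```python
import torch
import torch.nn as nn

# Hypothetical pre-trained embedding matrix; replace with the real
# weights from your model, e.g. model.get_input_embeddings().weight
vocab_size, embed_dim, num_classes = 1000, 64, 2
pretrained_weights = torch.randn(vocab_size, embed_dim)

class MyClassifier(nn.Module):
    def __init__(self, weights, num_classes, freeze=True):
        super().__init__()
        # Reuse the pre-trained weights; freeze=True keeps them fixed
        self.embedding = nn.Embedding.from_pretrained(weights, freeze=freeze)
        self.classifier = nn.Linear(weights.size(1), num_classes)

    def forward(self, input_ids):
        # Your own forward: embed, pool over the sequence, classify
        x = self.embedding(input_ids)   # (batch, seq_len, embed_dim)
        pooled = x.mean(dim=1)          # simple mean pooling
        return self.classifier(pooled)  # (batch, num_classes)

model = MyClassifier(pretrained_weights, num_classes)
logits = model(torch.randint(0, vocab_size, (4, 16)))
print(logits.shape)
```

Set `freeze=False` if you want the embeddings to keep training along with your head.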