I am working with PyTorch on a sequence-classification task. I want to use pre-trained embedding weights but write my own forward pass. How do I do it (namely, load the pre-trained weights into the embedding layer and then move forward with my own code)?
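For context, here is a minimal sketch of the kind of thing I have in mind. It assumes the pre-trained embeddings are already available as a weight tensor (the `MyClassifier` class, the mean-pooling step, and the random weights are just illustrative placeholders, not my actual model):

```python
import torch
import torch.nn as nn

class MyClassifier(nn.Module):
    """Sequence classifier that reuses pre-trained embedding weights
    but runs its own custom forward pass on top of them."""

    def __init__(self, pretrained_weights: torch.Tensor,
                 num_classes: int, freeze: bool = True):
        super().__init__()
        # nn.Embedding.from_pretrained copies the given weight matrix;
        # freeze=True keeps the embeddings fixed during training.
        self.embedding = nn.Embedding.from_pretrained(pretrained_weights,
                                                      freeze=freeze)
        hidden = pretrained_weights.size(1)
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, input_ids: torch.Tensor) -> torch.Tensor:
        emb = self.embedding(input_ids)   # (batch, seq_len, hidden)
        pooled = emb.mean(dim=1)          # simple mean pooling over the sequence
        return self.classifier(pooled)    # (batch, num_classes)

# Hypothetical usage with random stand-in "pre-trained" weights:
weights = torch.randn(1000, 64)           # vocab size 1000, embedding dim 64
model = MyClassifier(weights, num_classes=3)
logits = model(torch.randint(0, 1000, (2, 10)))
print(logits.shape)  # torch.Size([2, 3])
```

Is this the right pattern, or should I be overriding the forward of an existing pre-trained model class instead?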