I am trying to use a text classification model for sentiment analysis, and I need to share it with another team as a pickle file. Please help me with my query; I have spent too much time searching and found no help.
Is there a particular reason why the model needs to be shared as a pickle file? Transformers has a
save_pretrained method that you can use to save a model locally, but it’ll save it as a
.bin file along with the model’s config. Maybe this helps a bit? How to save my model to use it later
I guess you could pickle the state_dict of a model and then reload it manually, but I’m not sure I see the utility in that. Can you share a bit more about what you’re trying to achieve, and whether a pickle file is a hard requirement?
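For reference, here’s a minimal sketch of both options. The save_pretrained part assumes transformers is installed and uses a hypothetical model name, so it’s shown as comments; the pickle round-trip is demonstrated with a plain dict standing in for a real state_dict (which maps parameter names to tensors).

```python
import pickle

# Option 1: the Transformers-native way (assumes transformers is installed,
# and "distilbert-base-uncased-finetuned-sst-2-english" is just an example):
#
#   from transformers import AutoModelForSequenceClassification, AutoTokenizer
#   name = "distilbert-base-uncased-finetuned-sst-2-english"
#   model = AutoModelForSequenceClassification.from_pretrained(name)
#   tokenizer = AutoTokenizer.from_pretrained(name)
#   model.save_pretrained("my_model")      # writes the weights + config.json
#   tokenizer.save_pretrained("my_model")  # writes the tokenizer files alongside
#   # ...and the other team reloads with:
#   #   model = AutoModelForSequenceClassification.from_pretrained("my_model")

# Option 2: pickling a state_dict manually. A plain dict stands in for the
# real state_dict here so the snippet runs without torch installed.
state_dict = {"classifier.weight": [[0.1, -0.2]], "classifier.bias": [0.0]}

payload = pickle.dumps(state_dict)  # what you'd write to the .pkl file
restored = pickle.loads(payload)    # what the other team would load

print(restored == state_dict)  # True: the round-trip preserves the values
```

With a real model you’d pickle `model.state_dict()` and reload it via `model.load_state_dict(...)`, which still requires constructing the model (and so importing transformers) on the receiving side.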
Thank you for your response. I was trying to do the usual pickling of a model and sharing it, i.e. to make changes to the model and share it as a pickle file only, to be reused by someone else, so that the model, its config, and its tokenizer don’t have to be written out explicitly. But then I realized it isn’t worth the effort, since we already need to make the necessary imports from transformers anyway.