Hi, everyone. I need some help. I've been developing a Flask website that embeds a fine-tuned Transformers model. I fine-tuned the model with PyTorch, and I've tested the site on my local machine, where it worked fine.
I saved the fine-tuned model's weights so I could load them locally, as pictured in the figure below:
The saved results contain:
Then I tried to deploy it to a cloud instance I had reserved. Everything worked well until the model-loading step, which failed with:
OSError: Unable to load weights from PyTorch checkpoint file at <my model path/pytorch_model.bin>. If you tried to load a PyTorch model from a TF 2.0 checkpoint, please set from_tf=True.
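From what I've read, this error can mean the checkpoint file itself is corrupted or was transferred incompletely (for example, an unresolved git-lfs pointer file uploaded in place of the real weights). Here is a small stdlib-only sketch I could run on the instance to inspect the file; the path is a placeholder:

```python
import os

def peek_checkpoint(path):
    """Return (size_in_bytes, first_bytes) of a checkpoint file.

    A PyTorch >= 1.6 checkpoint is saved as a zip archive, so it should
    start with b'PK'; an unresolved git-lfs pointer is a tiny text file
    starting with b'version https://git-lfs'. A tiny size or a text
    header would mean the real weights never reached the server.
    """
    size = os.path.getsize(path)
    with open(path, "rb") as f:
        head = f.read(24)
    return size, head

# placeholder path -- replace with the actual model directory
# size, head = peek_checkpoint("my_model_path/pytorch_model.bin")
```

If the file starts with b'PK' and has roughly the same size as the local copy, the checkpoint is probably intact and the problem lies elsewhere (e.g., a torch/transformers version mismatch between machines).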
I've searched around the internet for a solution but have found nothing so far. Can someone shed some light on this?
By the way, I'm using an Ubuntu 18.04 instance, and my environment is:
- torch 1.7.0
- transformers 3.5.1
Thank you in advance!