How to extract the PyTorch model from a Transformers pretrained model

Thanks a lot for such a great project.

I have run into an issue while developing a new web application that uses Django RQ to manage workers. The problem is that I am not able to share the model across multiple processes. I even used

set_start_method('spawn', True)

but that did not help. I then found that calling share_memory() on the model (which relies on CUDA IPC) lets the GPU memory holding the model be shared across processes without running into

RuntimeError: Cannot re-initialize CUDA in forked subprocess. To use CUDA with multiprocessing, you must use the 'spawn' start method

It would be really helpful if you could guide me on how to call share_memory() on the underlying nn.Module of a Transformers model.
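For what it's worth, a Transformers pretrained model is itself an nn.Module subclass, so share_memory() can be called on it directly; there is no separate inner module to extract. Below is a minimal sketch of the pattern, using a plain torch.nn.Linear as a stand-in for the pretrained model so it runs anywhere (the worker function and queue names are illustrative, not part of any library API):

```python
import torch
import torch.multiprocessing as mp

# Stand-in for a Transformers model; a PreTrainedModel is also an
# nn.Module, so the same calls apply (e.g. AutoModel.from_pretrained(...)).
model = torch.nn.Linear(4, 2)

# Move all parameters and buffers into shared memory so that child
# processes started with the "spawn" method see the same storage
# instead of each holding a private copy.
model.share_memory()

def worker(m, x, q):
    # Runs in a child process; the weights are shared, not re-copied.
    with torch.no_grad():
        q.put(m(x).sum().item())

if __name__ == "__main__":
    mp.set_start_method("spawn", force=True)
    q = mp.Queue()
    x = torch.ones(1, 4)
    p = mp.Process(target=worker, args=(model, x, q))
    p.start()
    print(q.get())  # result of the forward pass computed in the child
    p.join()
```

For a GPU model the usual pattern is the same, except the model is moved to CUDA before sharing and all processes must use the spawn start method, exactly as the RuntimeError above demands.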