Elegant way to load and save a pretrained model as part of another model?

Hi, I am wondering whether there is an elegant way to load and save a pretrained model (e.g. BERT) as part of my own model.
For example, this is my own model, and I only want BERT to be an embedding layer:

import torch
from transformers import BertModel

class MyModel(torch.nn.Module):
  def __init__(self, config):
    super().__init__()  # required before registering submodules
    self.embedding = BertModel.from_pretrained(config.bert_path)
    self.other = OtherModel(config)

I can initialize it like this:

model = MyModel(config)  # where bert_path is in the config

After training, how can I save this model? Of course I can do something like this:

torch.save(model.state_dict(), save_path)

But later, if I want to reuse the model, I have to do something like this:

model = MyModel(config)
model.load_state_dict(torch.load(save_path))

So I have to rebuild the model from the config and then reload the weights. Is there a more elegant way to do something like this?

model = MyModel.from_pretrained(config)
model.save_pretrained(save_path)

I know I can define MyModel by subclassing PreTrainedModel to get the from_pretrained and save_pretrained methods, but this feels a little bit hacky and I can't find an easy way to do it. Could you offer some good ideas? Thanks so much.
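
For reference, this is roughly what I mean by the subclassing approach (a minimal sketch, assuming the Hugging Face transformers PreTrainedModel / PretrainedConfig API; MyConfig, hidden_size, and the Linear layer standing in for OtherModel are just placeholders I made up for the example):

import torch
from transformers import PreTrainedModel, PretrainedConfig, BertModel

class MyConfig(PretrainedConfig):
  model_type = "my_model"

  def __init__(self, bert_path="bert-base-uncased", hidden_size=768, **kwargs):
    super().__init__(**kwargs)
    self.bert_path = bert_path
    self.hidden_size = hidden_size

class MyModel(PreTrainedModel):
  config_class = MyConfig

  def __init__(self, config):
    super().__init__(config)
    self.embedding = BertModel.from_pretrained(config.bert_path)
    # stand-in for my OtherModel: a single linear layer on top of BERT
    self.other = torch.nn.Linear(config.hidden_size, config.hidden_size)

  def forward(self, input_ids, attention_mask=None):
    outputs = self.embedding(input_ids, attention_mask=attention_mask)
    return self.other(outputs.last_hidden_state)

# usage
config = MyConfig(bert_path="bert-base-uncased")
model = MyModel(config)
model.save_pretrained(save_path)              # writes config + weights
model = MyModel.from_pretrained(save_path)    # rebuilds from config, loads weights

The part that feels hacky to me is that MyModel.from_pretrained(save_path) still runs BertModel.from_pretrained(config.bert_path) inside __init__ before the saved weights are loaded over it, so the original BERT checkpoint gets loaded (or downloaded) again for nothing.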