Converting Hugging Face RoBERTa checkpoints back to original fairseq RoBERTa checkpoints

In this thread, https://github.com/pytorch/fairseq/issues/1514, there is a solution for converting fairseq RoBERTa checkpoints to Hugging Face PyTorch checkpoints. The thread suggests the opposite direction should be straightforward too, but any direct help here is appreciated. Basically, I want to convert HF checkpoints to the original fairseq RoBERTa format so they can be loaded by the fairseq implementation.
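As far as I can tell, the core of the conversion is renaming state-dict keys (the HF conversion script copies tensors without transposing, so the reverse should too). Below is a rough sketch of the reverse key mapping. The key names are my assumptions based on reading HF's convert_roberta_original_pytorch_checkpoint_to_pytorch.py, and they may differ across transformers/fairseq versions, so please verify against your installed versions:

```python
# Hypothetical sketch: rename HF RoBERTa state-dict keys back to
# fairseq-style keys. Key names are assumed from HF's conversion
# script and may not match every fairseq/transformers version.

# Ordered substitutions: attention-specific rules must run before the
# generic ".output.dense"/".output.LayerNorm" rules, since those
# substrings also occur inside attention keys.
RENAMES = [
    ("roberta.embeddings.word_embeddings", "decoder.sentence_encoder.embed_tokens"),
    ("roberta.embeddings.position_embeddings", "decoder.sentence_encoder.embed_positions"),
    ("roberta.embeddings.LayerNorm", "decoder.sentence_encoder.emb_layer_norm"),
    ("roberta.encoder.layer.", "decoder.sentence_encoder.layers."),
    (".attention.self.query", ".self_attn.q_proj"),
    (".attention.self.key", ".self_attn.k_proj"),
    (".attention.self.value", ".self_attn.v_proj"),
    (".attention.output.dense", ".self_attn.out_proj"),
    (".attention.output.LayerNorm", ".self_attn_layer_norm"),
    (".intermediate.dense", ".fc1"),
    (".output.dense", ".fc2"),
    (".output.LayerNorm", ".final_layer_norm"),
    ("lm_head.dense", "decoder.lm_head.dense"),
    ("lm_head.layer_norm", "decoder.lm_head.layer_norm"),
]


def hf_to_fairseq_key(key: str) -> str:
    """Apply the ordered renames to one HF state-dict key."""
    for old, new in RENAMES:
        key = key.replace(old, new)
    return key


def convert_state_dict(hf_state_dict: dict) -> dict:
    """Rename every key in an HF RoBERTa state dict to fairseq style.

    Tensor values are passed through unchanged; only the keys move.
    """
    return {hf_to_fairseq_key(k): v for k, v in hf_state_dict.items()}
```

The renamed dict would then be placed inside the checkpoint structure fairseq expects (e.g. under a `"model"` entry alongside the original `"args"`/`"cfg"` metadata taken from a reference fairseq checkpoint), which is easiest to verify by inspecting a checkpoint saved by fairseq itself.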

Thanks,
Jitendra