Custom modifications to Transformers

Are there any resources on Hugging Face for adding or removing layers in an already built model like BERT or RoBERTa?
I want to replace some layers in a transformer model for my project, for example swapping the self-attention layer for some other kind of attention.
I also want to add my own layer inside the transformer.

Thank you.

Hi,

The Transformers library is not really aimed at this use case. It’s not meant to be a modular toolbox, but rather targets people who want to use and fine-tune pre-trained models.

Of course, you could fork the Transformers library and tweak modeling_bert.py yourself, but if you want to hack around I’d recommend checking out other projects, such as the ones from Phil Wang (lucidrains) or facebookresearch/xformers, a collection of hackable and optimized Transformer building blocks supporting composable construction.
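That said, if you just want to experiment without forking, the sub-modules of a loaded model are ordinary `nn.Module` attributes, so you can swap them in place. Below is a minimal sketch, assuming a recent Transformers version where each encoder layer exposes its self-attention at `layer.attention.self`; `SimpleSelfAttention` is a hypothetical replacement module (here backed by `nn.MultiheadAttention`), not an official API. Note that the replacement must return a tuple whose first element has shape `(batch, seq, hidden)`, since the surrounding `BertAttention` block indexes into that tuple:

```python
import torch
import torch.nn as nn
from transformers import BertConfig, BertModel

class SimpleSelfAttention(nn.Module):
    """Hypothetical drop-in replacement for BertSelfAttention.

    Ignores the extra arguments BERT passes (attention mask, head mask, ...)
    for brevity; a real replacement would handle them.
    """
    def __init__(self, config):
        super().__init__()
        self.attn = nn.MultiheadAttention(
            config.hidden_size, config.num_attention_heads, batch_first=True
        )

    def forward(self, hidden_states, *args, **kwargs):
        out, _ = self.attn(hidden_states, hidden_states, hidden_states)
        # BertAttention expects a tuple; the first element is the context.
        return (out,)

# Small randomly initialized config so no download is needed for the demo;
# with a pre-trained checkpoint you would use BertModel.from_pretrained(...).
config = BertConfig(
    hidden_size=64, num_hidden_layers=2, num_attention_heads=4,
    intermediate_size=128, vocab_size=100,
)
model = BertModel(config)

# Replace the self-attention sub-module in every encoder layer.
for layer in model.encoder.layer:
    layer.attention.self = SimpleSelfAttention(config)

ids = torch.randint(0, 100, (1, 8))
out = model(input_ids=ids)
print(out.last_hidden_state.shape)  # (batch, seq, hidden) = (1, 8, 64)
```

Adding your own layer works the same way: `model.encoder.layer` is an `nn.ModuleList`, so you can insert or append extra modules into it, as long as their `forward` matches what the encoder expects to pass and receive. For anything beyond quick experiments, forking and editing modeling_bert.py remains the cleaner route.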