Hey community, I hope you're doing fine.
I'm new to the Hugging Face framework, so my question is: is there any way to download Hugging Face models (like BERT…) without their pretrained weights, i.e. the architecture only? I'm using PyTorch.
Thank you so much.
You can construct one using your own configuration:
from transformers import BertConfig, BertForMaskedLM
config = BertConfig()
model = BertForMaskedLM(config=config)
where in the config variable you provide the parameters of the model: the number of attention heads, the feed-forward network size, etc.
This way you can train from scratch and use BERT however you wish, without needing to download its pre-trained weights.
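As a minimal sketch of that idea: the snippet below builds a small, randomly initialised BERT (the layer sizes are arbitrary examples, not BERT's defaults) and never downloads any weights.

```python
from transformers import BertConfig, BertForMaskedLM

# Hypothetical small configuration -- pick sizes to suit your task.
config = BertConfig(
    hidden_size=128,          # embedding / hidden dimension
    num_hidden_layers=2,      # number of transformer blocks
    num_attention_heads=4,    # attention heads per block
    intermediate_size=512,    # feed-forward (FCN) layer size
)

# Instantiating from a config gives random weights; nothing is downloaded.
model = BertForMaskedLM(config)

# All parameters are freshly initialised and trainable from scratch.
n_params = sum(p.numel() for p in model.parameters())
print(f"model has {n_params} parameters")
```

Note that `from_pretrained(...)` is what downloads weights; constructing the model directly from a `BertConfig` skips that entirely.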
@Neel-Gupta is there a good hello-world example doing that?
@Neel-Gupta thank you so much.