Does `BertEmbeddings` contain positional embeddings?

Hello there! BertEmbeddings has the positional embeddings for you: transformers/modeling_bert.py at 31d452c68b34c2567b62924ee0df40a83cbc52d5 · huggingface/transformers · GitHub
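
You can check for yourself in a couple of lines (a quick sketch, assuming `bert-base-uncased`; the shapes below are for that checkpoint):

```python
from transformers import BertModel

model = BertModel.from_pretrained("bert-base-uncased")

# The position embeddings live on the embeddings module,
# alongside the word and token-type embeddings.
print(model.embeddings.position_embeddings)
# Embedding(512, 768) -> a learned lookup table, one row per position
```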

We use a vanilla (learned) nn.Embedding layer instead of the fixed sin/cos positional encoding from the original Transformer paper (more on that here: Why positional embeddings are implemented as just simple embeddings? - #6 by yjernite). If you want to override that and use sinusoidal embeddings, though, you can follow the tip here: How to use custom positional embedding while fine tuning Bert - #2 by JinKing
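
That tip roughly boils down to overwriting the learned weights with a sin/cos table and freezing them. A minimal sketch (assuming `bert-base-uncased`; whether you freeze the weights is up to you):

```python
import math
import torch
from transformers import BertModel

model = BertModel.from_pretrained("bert-base-uncased")
pos_emb = model.embeddings.position_embeddings  # nn.Embedding(512, 768)
n_pos, dim = pos_emb.weight.shape

# Build the classic sin/cos table: sin on even dims, cos on odd dims
position = torch.arange(n_pos, dtype=torch.float).unsqueeze(1)
div_term = torch.exp(
    torch.arange(0, dim, 2, dtype=torch.float) * (-math.log(10000.0) / dim)
)
table = torch.zeros(n_pos, dim)
table[:, 0::2] = torch.sin(position * div_term)
table[:, 1::2] = torch.cos(position * div_term)

# Overwrite the learned weights, then freeze them so fine-tuning
# doesn't update the sinusoidal values
with torch.no_grad():
    pos_emb.weight.copy_(table)
pos_emb.weight.requires_grad = False
```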

Hope that helps! Cheers!
