Does `BertEmbeddings` include positional embeddings?


I am using BertEmbeddings from transformers.models.bert.modeling_bert for a custom model.

I’m not sure whether the output already includes positional information (like the sin/cos positional encoding from “Attention Is All You Need”), or whether I need to add the position encoding myself.

from transformers import BertConfig
from transformers.models.bert.modeling_bert import BertEmbeddings

# Note: BertConfig("bert-base-uncased") would pass the string as vocab_size;
# use from_pretrained to load the config for a checkpoint.
bert_config = BertConfig.from_pretrained("bert-base-uncased")
bert_embeds = BertEmbeddings(bert_config)

# text_ids: a LongTensor of token ids, shape (batch_size, seq_len)
text_embeddings = bert_embeds(text_ids)

text_embeddings  # Does this include positional embeddings?
# Or should I do
final_embeddings = text_embeddings + positional_embeddings


Hello there! BertEmbeddings already adds the positional embeddings for you: transformers/modeling_bert.py at 31d452c68b34c2567b62924ee0df40a83cbc52d5 · huggingface/transformers · GitHub

We use a vanilla nn.Embedding layer (a learned positional embedding) instead of the fixed sin/cos positional encoding (more on that here: Why positional embeddings are implemented as just simple embeddings? - #6 by yjernite). If you want to override that and use sinusoidal embeddings, you can follow the tip here: How to use custom positional embedding while fine tuning Bert - #2 by JinKing
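If you want to sanity-check this yourself, here is a minimal sketch (assuming transformers and torch are installed; a default-sized BertConfig is used for illustration): feed the same token id at two different positions, and the two output vectors differ, precisely because the learned position embeddings are added inside BertEmbeddings.

```python
import torch
from transformers import BertConfig
from transformers.models.bert.modeling_bert import BertEmbeddings

config = BertConfig()  # default bert-base-sized config, random init
embeds = BertEmbeddings(config)
embeds.eval()  # disable dropout so the comparison is deterministic

# The same token id repeated at positions 0 and 1.
ids = torch.tensor([[2023, 2023]])
with torch.no_grad():
    out = embeds(input_ids=ids)  # shape: (1, 2, hidden_size)

# If no positional information were added, both rows would be identical.
same = torch.allclose(out[0, 0], out[0, 1])
print(same)  # False: position embeddings make the two vectors differ
```

So there is no need for a manual `text_embeddings + positional_embeddings` step; word, position, and token-type embeddings are summed (then LayerNorm’d) inside the module.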

Hope that helps! Cheers!


Extremely helpful. Thanks!
