I have precomputed custom word embeddings, and I am feeding them into a BERT model like this:
self.bert = BertModel.from_pretrained('bert-base-uncased')
self.bert(inputs_embeds=x, attention_mask=attention_mask, *args, **kwargs)
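For context, here is a minimal, self-contained version of my setup (the class name CustomEmbeddingBert and the shape of x are just my own choices):

import torch
from transformers import BertModel

class CustomEmbeddingBert(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.bert = BertModel.from_pretrained('bert-base-uncased')

    def forward(self, x, attention_mask=None):
        # x: my precomputed embeddings, shape (batch, seq_len, 768)
        # passed via inputs_embeds instead of input_ids
        return self.bert(inputs_embeds=x, attention_mask=attention_mask)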
Does this mean I'm replacing all of BERT's input embeddings (token + position + segment embeddings)?
How can I keep all of the embeddings, i.e. (token + position + segment + custom embeddings)?
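What I was considering is a sketch like the following (assuming my custom embeddings custom_embeds have the same (batch, seq_len, 768) shape as BERT's word embeddings): look up the token embeddings myself, add my custom embeddings, and pass the sum through inputs_embeds, since, if I understand BertEmbeddings correctly, the position and segment embeddings are still added internally on top of inputs_embeds:

# look up BERT's own token (word) embeddings for the input ids
word_embeds = self.bert.embeddings.word_embeddings(input_ids)
# add my custom embeddings element-wise (shapes must match)
combined = word_embeds + custom_embeds
# position + segment (token_type) embeddings should still be applied
# inside BertEmbeddings before the encoder
outputs = self.bert(inputs_embeds=combined, attention_mask=attention_mask)

Would that be the correct approach?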