For a particular task, I need to create a unique embedding for a given word. BERT models can’t do this directly because the embedding for a word varies with its context. However, is there a way to use these contextual embeddings to create a representation of the word that maintains its semantic relations to other words? For example, if I take the mean of all the embedding representations of ‘king’, will that vector satisfy semantic relations with a similarly generated vector for ‘queen’? A rough sketch of what I mean is below. Any ideas will be greatly appreciated.
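To make the idea concrete, here is a sketch of the mean-pooling approach, using the Hugging Face transformers library; the model name, the helper function, and the example sentences are just illustrative assumptions:

```python
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def mean_contextual_embedding(word, sentences):
    """Average the contextual vectors of `word` over several sentences."""
    # First WordPiece id of the target word (assumes it isn't split,
    # which holds for common words like 'king' in this vocab).
    word_id = tokenizer(word, add_special_tokens=False)["input_ids"][0]
    vectors = []
    for sent in sentences:
        enc = tokenizer(sent, return_tensors="pt")
        with torch.no_grad():
            hidden = model(**enc).last_hidden_state[0]  # (seq_len, dim)
        # Collect the hidden state at every position where the word occurs.
        for i, tok in enumerate(enc["input_ids"][0].tolist()):
            if tok == word_id:
                vectors.append(hidden[i])
    return torch.stack(vectors).mean(dim=0)

king = mean_contextual_embedding("king", ["The king ruled the land.", "A king wears a crown."])
queen = mean_contextual_embedding("queen", ["The queen ruled the land.", "A queen wears a crown."])
print(torch.nn.functional.cosine_similarity(king, queen, dim=0).item())
```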
I think simply using the word embedding layer that BERT is initialized with (its input token embeddings, which are context-independent) is one way to do that. Other options are static embedding methods like word2vec, GloVe, fastText, etc.
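For instance, a minimal sketch of pulling out that input embedding layer with the Hugging Face transformers library; averaging over WordPiece sub-tokens is my own assumption for handling words that get split:

```python
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# BERT's input (token) embedding layer is context-independent:
# the same token id always maps to the same vector.
embedding_layer = model.get_input_embeddings()  # an nn.Embedding

def static_embedding(word):
    # A word may be split into several WordPiece tokens; average them.
    token_ids = tokenizer(word, add_special_tokens=False)["input_ids"]
    with torch.no_grad():
        vectors = embedding_layer(torch.tensor(token_ids))
    return vectors.mean(dim=0)

king = static_embedding("king")
queen = static_embedding("queen")
print(torch.nn.functional.cosine_similarity(king, queen, dim=0).item())
```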