Can we resize embeddings with the new embedding weights initialized differently?

When we add new tokens, this method automatically extends the embedding matrix using PyTorch's `nn.Embedding`.

The documentation says the resized embeddings are `nn.Embedding`, which by default initializes weights from N(0, 1). But when I checked, the resized embedding weights look closer to N(0, 0.01) or N(0, 0.02). How can I check the true distribution of the resized embedding weights?
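One way to check is to look at the empirical mean and standard deviation of the weight tensor directly. A minimal sketch with a plain `nn.Embedding` (the model and variable names in the comments are illustrative, not from the original post):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# A plain nn.Embedding initializes its weights from N(0, 1) by default,
# so the empirical std should be close to 1.0.
emb = nn.Embedding(1000, 256)
print(emb.weight.mean().item(), emb.weight.std().item())

# With a Hugging Face model, you would inspect only the newly added rows
# after resizing (old_vocab_size here is hypothetical):
#   new_rows = model.get_input_embeddings().weight[old_vocab_size:]
#   print(new_rows.mean().item(), new_rows.std().item())
```

If the new rows show a std near 0.02 rather than 1.0, that suggests the model re-initialized them with its own `_init_weights` logic (many configs use an `initializer_range` of 0.02) rather than leaving the raw `nn.Embedding` default.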

If I want the embedding weights initialized differently, how can I achieve that efficiently?
