When we add new tokens, this method automatically extends the embedding matrix using `torch.nn.Embedding`.
The documentation says the resized embeddings are `nn.Embedding`, which by default initializes its weights from N(0, 1) (https://pytorch.org/docs/stable/generated/torch.nn.Embedding.html). But when I inspected the resized embedding weights, they look closer to N(0, 0.01) or N(0, 0.02). How can I check the true distribution of the resized embedding weights?
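One way to check this empirically is to resize the embeddings yourself and look at the statistics of only the newly added rows. The sketch below assumes the HuggingFace `transformers` library; it builds a tiny randomly initialized GPT-2 (so no weights need to be downloaded), and the layer sizes and number of added tokens are arbitrary choices for illustration:

```python
import torch
from transformers import GPT2Config, GPT2LMHeadModel

torch.manual_seed(0)

# Tiny randomly initialized GPT-2; sizes are arbitrary for this check.
config = GPT2Config(n_layer=1, n_head=2, n_embd=64, vocab_size=100)
model = GPT2LMHeadModel(config)

old_size = model.get_input_embeddings().weight.shape[0]  # 100

# Pretend we added 500 new tokens to the tokenizer.
model.resize_token_embeddings(old_size + 500)

# The newly initialized rows are the last 500 rows of the matrix.
new_rows = model.get_input_embeddings().weight[old_size:].detach()
print(f"mean={new_rows.mean().item():.4f}  std={new_rows.std().item():.4f}")
```

If the new rows really followed `nn.Embedding`'s default N(0, 1) init, the printed std would be near 1.0; in practice it is much smaller, because `resize_token_embeddings` re-initializes the new rows via the model's own `_init_weights` (for GPT-2 a normal distribution with std `config.initializer_range`, 0.02 by default) rather than leaving the raw `nn.Embedding` init in place. Exact behavior can vary by `transformers` version (newer releases can also initialize new rows from the mean of the existing embeddings), so checking the statistics directly as above is the most reliable approach.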