How to map input ids to the indices of a limited-size Embedding

I have an embedding with a limited size (say 5):

        self.embedding = torch.nn.Embedding(length, embedding_dim)

I receive input ids like (7, 18, 6, …) as a PyTorch tensor. However, the embedding vector for id 7 is stored in the first row of the embedding table, the one for 18 in the second row, and so on.

I want a mapping from these ids to 0, 1, 2, … so I can look up the stored vectors in the embedding.
It seems I can't use a dictionary as follows, since iterating over a tensor yields 0-dim tensors rather than the Python ints the dictionary is keyed by, and nn.Embedding expects a LongTensor rather than a list:

    def forward(self, prompt_token_ids, pids=None):
        prompt_token_ids = [self.id_map[x] for x in prompt_token_ids]
        return self.embedding(prompt_token_ids)
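
For reference, converting each element back to a Python int does make the dictionary lookup work, but it loops in Python over the tensor on every forward pass (a sketch only; it assumes self.id_map is keyed by plain ints):

    def forward(self, prompt_token_ids, pids=None):
        # int(t) turns each 0-dim tensor element back into a hashable Python int
        rows = torch.tensor([self.id_map[int(t)] for t in prompt_token_ids],
                            device=prompt_token_ids.device)
        return self.embedding(rows)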

How can I perform this id-to-index mapping directly on tensors?
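
One approach I am considering (a minimal sketch, assuming the set of valid ids is known when the module is built; PromptEmbedding, known_ids, and id_lookup are names made up for illustration) is to precompute a dense lookup tensor once, so the forward pass is pure tensor indexing:

    import torch

    class PromptEmbedding(torch.nn.Module):
        def __init__(self, known_ids, embedding_dim):
            super().__init__()
            known_ids = torch.as_tensor(known_ids, dtype=torch.long)
            self.embedding = torch.nn.Embedding(len(known_ids), embedding_dim)
            # Dense table: id_lookup[id] holds the embedding row for that id.
            # Unseen ids map to -1, which would silently index the last row,
            # so unknown ids should be validated upstream.
            id_lookup = torch.full((int(known_ids.max()) + 1,), -1, dtype=torch.long)
            id_lookup[known_ids] = torch.arange(len(known_ids))
            # A buffer follows the module across .to(device) / .cuda() calls
            self.register_buffer("id_lookup", id_lookup)

        def forward(self, prompt_token_ids, pids=None):
            # Pure tensor indexing: no Python loop, stays on the device
            return self.embedding(self.id_lookup[prompt_token_ids])

Usage would then look like:

    emb = PromptEmbedding(known_ids=[7, 18, 6, 42, 3], embedding_dim=8)
    out = emb(torch.tensor([7, 18, 6]))   # shape: (3, 8)

The table costs memory proportional to the largest id, which seems fine for typical vocabularies; if the ids were sparse and very large, torch.searchsorted over a sorted copy of known_ids would be an alternative.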