Class label encoding order produces different models

I used to encode my labels using their numbers, e.g. -1, 10, 13, etc.
Then I tried using their class names instead, e.g. ‘other’, ‘finance’, etc.

This makes the encoded labels different, because the names are encoded in alphabetical order. That shouldn’t make the training behave differently, though (the class numbers and the class names match up correctly).

But it seems that the trainer somehow makes use of the encoded label order, which makes it produce different models.
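
For concreteness, here is a minimal sketch of how the index mapping can shift when switching from numbers to names (the third class name and the use of sklearn’s LabelEncoder are my assumptions, not something stated in the post):

```python
from sklearn.preprocessing import LabelEncoder

# With numeric labels, the sorted class order is numeric:
# -1 -> index 0, 10 -> index 1, 13 -> index 2
num_enc = LabelEncoder().fit([-1, 10, 13])
print(num_enc.classes_)   # [-1 10 13]

# With string labels, the same classes are sorted alphabetically,
# so a class can land on a different index than before
name_enc = LabelEncoder().fit(["other", "finance", "legal"])
print(name_enc.classes_)  # ['finance' 'legal' 'other']
```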

I’m using nn.CrossEntropyLoss(weight=self.labels_weights) with label weights.
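
One thing worth noting here: the weight tensor in nn.CrossEntropyLoss is purely positional, i.e. weight[i] scales the loss for whichever class happens to get index i. So if the class-to-index mapping changes but self.labels_weights is still built in the old order, each class is effectively trained with a different weight than before. A minimal sketch of this (all values made up):

```python
import torch
import torch.nn as nn

# weight[i] scales the loss for class index i; nothing else ties a
# weight to a class, so the encoding order matters
weights = torch.tensor([0.2, 1.0, 3.0])  # made-up values
loss_fn = nn.CrossEntropyLoss(weight=weights)

logits = torch.randn(4, 3)
targets = torch.tensor([0, 2, 1, 0])
print(loss_fn(logits, targets))

# If the alphabetical encoding permutes the classes, the weight tensor
# has to be permuted the same way to keep the objective identical
perm = torch.tensor([1, 2, 0])  # hypothetical: perm[new_index] = old_index
loss_fn_aligned = nn.CrossEntropyLoss(weight=weights[perm])
```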

What’s the reason for this behaviour?

> This makes the encoded labels different, because the names are encoded in alphabetical order, which shouldn’t make the training behave differently.

Can you try using the same order as the numbers instead of alphabetical order?

Yeah, I tried it. I added the numbers 0, 1, 2, 3, … as prefixes to make the names sort in the same order as the original one, and it produced exactly the same results!
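
For anyone who lands here later, a minimal sketch of that prefix trick (the class names are illustrative):

```python
# Prefixing each class name with its original numeric position makes
# alphabetical sorting reproduce the original order, so the encoded
# indices (and the weight alignment) stay the same.
names_in_original_order = ["other", "finance", "legal"]  # hypothetical
prefixed = [f"{i}_{name}" for i, name in enumerate(names_in_original_order)]
print(sorted(prefixed))  # ['0_other', '1_finance', '2_legal']
```

One caveat with this trick: with ten or more classes, plain string sorting puts ‘10_’ before ‘2_’, so zero-padding the prefix (e.g. ‘02_’) is safer.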