Well, I’ve trained a new model on top of GPT-2 and added some special tokens to control my generation. Everything worked fine while I was using the greedy search algorithm, but when I switched to beam search it started to mess up my tokens, mixing them with regular words. Does anyone have a clue why that is happening, and how I can solve it?
Did you find an answer?