Restrict vocab size of a pretrained transformer

Hello,
for a personal project, I'd like to use a pretrained transformer model to make next-word predictions. The catch is that for this project, only 200 words can appear in the input. That is, sentences will be written with combinations of those 200 words, and the model should predict the next word of a given sentence as one of those 200 words.
There's no corpus of sentences composed of these 200 words to train on, so I was looking for a way to do something like: grab a pretrained transformer that can predict words, restrict its predictions to the 200 pre-defined words, and use it as-is.
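
To make it concrete, here's a rough sketch of the kind of thing I imagine, masking the model's output logits so only the allowed words can be predicted. GPT-2 and the tiny word list are just placeholders for illustration, and I'm not sure this is the right approach (e.g. I don't know how to handle words that the tokenizer splits into several sub-word tokens):

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

allowed_words = ["cat", "dog", "runs"]  # stand-in for my 200 words

# Collect the token ids of the allowed words. Many words map to several
# BPE tokens; here I only keep the single-token ones, which is part of
# what I'm unsure about.
allowed_ids = []
for word in allowed_words:
    ids = tokenizer.encode(" " + word)  # leading space for GPT-2's BPE
    if len(ids) == 1:
        allowed_ids.append(ids[0])

sentence = "the cat"
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits[0, -1]  # logits for the next token

# Mask out every token except the allowed ones, then pick the best.
mask = torch.full_like(logits, float("-inf"))
mask[allowed_ids] = 0.0
next_id = int((logits + mask).argmax())
print(tokenizer.decode([next_id]))
```

Is something along these lines reasonable, or is there a standard way to do this?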

Is there a way to do this? Can someone point me to a resource?

Thanks for your time,
Fran.