How do GPT2 pretrained models allow custom hyperparams?

When creating a pretrained GPT2 model, e.g. with GPT2LMHeadModel.from_pretrained('gpt2', config=config), we can pass a config with custom parameters such as vocab_size, n_positions, n_embd, activation_function, and n_head.
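
For concreteness, here is the pattern I mean (a minimal sketch assuming the Hugging Face transformers library; the values shown happen to match the stock gpt2 config):

```python
from transformers import GPT2Config, GPT2LMHeadModel

# A config with explicitly specified hyperparameters. These particular
# values match the published gpt2 checkpoint; the question is what
# happens when they don't.
config = GPT2Config(
    vocab_size=50257,
    n_positions=1024,
    n_embd=768,
    n_head=12,
    activation_function="gelu_new",
)

# Load the pretrained gpt2 weights while passing the custom config.
model = GPT2LMHeadModel.from_pretrained("gpt2", config=config)
```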

How is it possible to use custom values for these on a pretrained model? For example, how could we choose the number of attention heads after the model has already been trained? Is there simply a large number of pretrained models, one for each possible config?
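
Concretely, I am asking what a call like this would do to the pretrained attention weights (hypothetical attempt; gpt2 was trained with n_head=12):

```python
from transformers import GPT2Config, GPT2LMHeadModel

# Hypothetical: ask for 8 attention heads instead of the 12 used in training.
custom_config = GPT2Config(n_head=8)

# Does this reuse the pretrained weights in a meaningful way, or does a
# different head count effectively require training a new model?
model = GPT2LMHeadModel.from_pretrained("gpt2", config=custom_config)
```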