How are the open-source models released by vendors made available on Hugging Face?

From Python and PyTorch, I know that a model can be saved in two ways: either only the parameters (the state dict) or the whole model object.
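For concreteness, here is a minimal PyTorch sketch of the two saving options, using a toy `nn.Linear` as a stand-in for a real model:

```python
import torch
import torch.nn as nn

# A tiny stand-in model; Llama 2 is far larger, but the
# save/load mechanics are identical.
model = nn.Linear(4, 2)

# Option 1: save only the parameters (the "state dict").
# Loading later requires code that rebuilds the same architecture first.
torch.save(model.state_dict(), "weights.pt")

rebuilt = nn.Linear(4, 2)  # must match the original structure
rebuilt.load_state_dict(torch.load("weights.pt"))

# Option 2: pickle the entire module object. This ties the file to the
# exact class definitions and torch version on the saving machine, so
# public releases almost always ship state dicts (option 1) instead.
torch.save(model, "whole_model.pt")

print(torch.equal(model.weight, rebuilt.weight))  # True
```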

For example, Meta recently released Llama 2. What did they actually release: only the model parameters, or the whole model?
If they released only the parameters, somebody must have implemented the corresponding model architecture in the Hugging Face transformers library. If so, how does that person know what to implement?

What do you mean by "only parameters"? The model structure is of course included in what they publish, and so is the tokenizer.
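Concretely, the structure is described by a `config.json` file that ships alongside the weight files; transformers reads it to decide which model class to build and how to size it. A trimmed, illustrative excerpt (field names follow the transformers `LlamaConfig` convention; the values shown are an assumption about the 7B variant, not a verbatim copy of Meta's file):

```python
import json

# Hypothetical trimmed-down excerpt of a Llama 2 config.json.
config_text = """
{
  "architectures": ["LlamaForCausalLM"],
  "hidden_size": 4096,
  "num_hidden_layers": 32,
  "num_attention_heads": 32,
  "vocab_size": 32000
}
"""

config = json.loads(config_text)

# "architectures" names the model class to instantiate;
# the remaining fields parameterize that class.
print(config["architectures"][0])   # LlamaForCausalLM
print(config["num_hidden_layers"])  # 32
```

So a release is typically three pieces: the weights (a state dict), the config describing the architecture, and the tokenizer files.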