Hyperparameter config persistence

I am interested in replicating models, and to this end I am looking for information about the training parameters of a model of interest. In several repositories I find a file named config.json, which often provides a range of model parameters, but most of the time there is no information about, for example, the optimizer that was used and its settings. Is there a standard or suggested way of saving such information in Hugging Face?
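To be concrete, this is the kind of sidecar file I currently write myself next to config.json. The file name and keys are my own invention, not any official Hugging Face convention; it is just a plain JSON dump of the training hyperparameters that config.json does not cover:

```python
import json

# Hypothetical sidecar file for training hyperparameters missing from
# config.json -- file name and key names are my own choice, not an
# official Hugging Face convention.
training_config = {
    "optimizer": {
        "name": "AdamW",
        "learning_rate": 5e-5,
        "betas": [0.9, 0.999],
        "weight_decay": 0.01,
    },
    "lr_scheduler": "linear",
    "warmup_steps": 500,
    "num_train_epochs": 3,
    "per_device_train_batch_size": 16,
    "seed": 42,
}

# Save alongside the model files so the run can be reproduced later.
with open("training_config.json", "w") as f:
    json.dump(training_config, f, indent=2)
```

I know that the `Trainer` saves a `training_args.bin` in its checkpoint directories, but that is a pickled object rather than a human-readable file in the repository, which is why I am asking whether there is a recommended plain-text format for this.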