Some RoBERTa weights are not initialized from the checkpoint

I get this warning after loading a trained RoBERTa model from a checkpoint. I'm loading the model like this:

model = RobertaModel.load_from_checkpoint(checkpoint_path=<path>)
model.freeze()

The warning:

Some weights of RobertaModel were not initialized from the model checkpoint at roberta-base and are newly initialized: ['roberta.pooler.dense.bias', 'roberta.pooler.dense.weight']

You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
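If I understand the warning correctly, the checkpoint's state dict simply contains no tensors for `roberta.pooler.dense.*`, so those layers keep their fresh random initialization. A toy PyTorch sketch of that situation (hypothetical module names, not the actual `transformers` classes), where loading with `strict=False` reports the absent keys as missing:

```python
import torch.nn as nn

class Encoder(nn.Module):
    """Stands in for the model that produced the checkpoint (no pooler)."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Linear(4, 4)

class EncoderWithPooler(nn.Module):
    """Stands in for RobertaModel, which has a pooler layer."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Linear(4, 4)
        self.pooler = nn.Linear(4, 4)  # no matching tensors in the checkpoint

ckpt = Encoder().state_dict()  # checkpoint lacks any pooler.* keys
model = EncoderWithPooler()

# strict=False loads what matches and reports the rest as missing;
# the missing layers keep their newly-initialized random weights.
result = model.load_state_dict(ckpt, strict=False)
print(result.missing_keys)  # ['pooler.weight', 'pooler.bias']
```

So the `embed` weights load fine, while the pooler stays randomly initialized, which is what the warning is pointing out for the real model.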