DPR pretrained context encoder: Unused weight warning

Hi! I'm working with DPR (Dense Passage Retrieval), and I want to verify that it performs well on the development set downloaded from:

https://dl.fbaipublicfiles.com/dpr/data/retriever/biencoder-nq-dev.json.gz

I initialize the context encoder with:

from transformers import AutoModel

passage_encoder = AutoModel.from_pretrained("facebook/dpr-ctx_encoder-single-nq-base")

This raises the warning:

Some weights of the model checkpoint at facebook/dpr-ctx_encoder-single-nq-base were not used when initializing DPRQuestionEncoder:
['ctx_encoder.bert_model.encoder.layer.10.attention.output.dense.bias',
'ctx_encoder.bert_model.encoder.layer.11.attention.self.value.bias',
...,
'bert_model.encoder.layer.1.attention.output.dense.bias']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.

I just want to evaluate the pretrained model on the dev set, not train it. Why am I getting a warning about unused weights?

Also, the warning refers to initializing DPRQuestionEncoder, whereas I'm initializing the context encoder. So I'm confused!
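For reference, here is a minimal version of what I'm running; the passage text is just a placeholder, and I'm assuming the pooled output is the dense embedding DPR uses for retrieval:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Load the context-encoder checkpoint (this is the step that triggers the warning)
tokenizer = AutoTokenizer.from_pretrained("facebook/dpr-ctx_encoder-single-nq-base")
passage_encoder = AutoModel.from_pretrained("facebook/dpr-ctx_encoder-single-nq-base")
passage_encoder.eval()

# Encode a single placeholder passage
inputs = tokenizer("Aaron is a prophet in the Hebrew Bible.", return_tensors="pt")
with torch.no_grad():
    emb = passage_encoder(**inputs).pooler_output  # dense passage embedding

print(emb.shape)  # 768-dimensional vector per passage
```

The model loads and produces embeddings of the expected shape, which makes the unused-weight warning even more puzzling to me.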