Use a custom model for mask filling with pipeline

I was able to use the pipeline for the fill-mask task.
Example:

from transformers import pipeline

# off-the-shelf BERT fills in the [MASK] tokens
bert_unmask = pipeline('fill-mask', model='bert-base-cased')
bert_unmask("a [MASK] black [MASK] runs along a fence in the grass", top_k=2)

Now I want to fine-tune bert-base-cased on my dataset and use it in a similar manner to the above to fill the mask.

from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
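
The optimizer, scheduler and device used by the training loop below come from the usual setup, roughly like this (the learning rate and epoch count are just values I picked):

import torch
from torch.optim import AdamW
from transformers import get_scheduler

optimizer = AdamW(model.parameters(), lr=5e-5)
num_epochs = 3
# train_dataloader comes from my data-processing step (not shown)
num_training_steps = num_epochs * len(train_dataloader)
lr_scheduler = get_scheduler("linear", optimizer=optimizer,
                             num_warmup_steps=0, num_training_steps=num_training_steps)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")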

from tqdm.auto import tqdm

progress_bar = tqdm(range(num_training_steps))

model.to(device)  # make sure the model is on the same device as the batches
model.train()
for epoch in range(num_epochs):
    for batch in train_dataloader:
        batch = {k: v.to(device) for k, v in batch.items()}
        outputs = model(**batch)
        loss = outputs.loss
        loss.backward()

        optimizer.step()
        lr_scheduler.step()
        optimizer.zero_grad()
        progress_bar.update(1)

Here I have processed the data and done a simple fine-tuning of the pretrained BERT. Now I want to use this fine-tuned model with the pipeline for the fill-mask task.
I did the following to use the fine-tuned model:

pipeline('fill-mask', model=model)

But the model here was fine-tuned for the sequence-classification task, and I now want to use it for the mask-filling task.
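
As an aside, I believe that when a model object (rather than a model name) is passed in, the pipeline also needs the tokenizer, so I assume the call would at least have to look like this (an untested sketch):

pipeline('fill-mask', model=model, tokenizer=tokenizer)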

Is it possible to use a model fine-tuned for sequence classification for masked-token filling with Hugging Face? Or, if not, how can I fine-tune a pretrained BERT-based model for the masked-token-filling task?
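
In case it clarifies what I am after: my guess is that I should have used a masked-LM head from the start. Below is a minimal sketch of what I think that would look like, assuming AutoModelForMaskedLM and DataCollatorForLanguageModeling are the right pieces; tokenized_dataset and the my-finetuned-bert save path are placeholders of mine.

from torch.utils.data import DataLoader
from transformers import (AutoTokenizer, AutoModelForMaskedLM,
                          DataCollatorForLanguageModeling, pipeline)

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-cased")  # MLM head instead of a classification head

# randomly masks tokens in each batch and sets `labels`, so outputs.loss is the MLM loss
data_collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

# tokenized_dataset stands in for my own tokenized text dataset
train_dataloader = DataLoader(tokenized_dataset, batch_size=8, shuffle=True, collate_fn=data_collator)

# ... then the same training loop as above ...

model.save_pretrained("my-finetuned-bert")       # placeholder path
tokenizer.save_pretrained("my-finetuned-bert")
unmasker = pipeline("fill-mask", model="my-finetuned-bert")
unmasker("a [MASK] black [MASK] runs along a fence in the grass", top_k=2)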