LIME for text: does it work for bigrams instead of unigrams?

Hi,

I am trying to explain the predictions of a fine-tuned BERT model for a binary text classification problem using LIME.

LIME lets us extract the top n features (words) most relevant to a prediction on a sentence. I was wondering whether the LIME explainer can be modified in some way to return both bi-gram and uni-gram features/words?
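For concreteness, one workaround I have been considering (not a built-in LIME option, just a sketch) is to pre-merge adjacent words into single underscore-joined tokens, so that LIME's default tokenizer treats each bigram as one feature, and then split them back before calling the model. Here `bert_predict_proba` is a hypothetical wrapper around my fine-tuned model that takes raw strings and returns class probabilities:

```python
def to_bigram_tokens(text):
    # Merge adjacent word pairs with '_' so LIME's default split expression
    # (which breaks on non-word characters) sees each bigram as one feature.
    # Pairs are non-overlapping; a trailing odd word stays a unigram.
    words = text.split()
    return " ".join("_".join(words[i:i + 2]) for i in range(0, len(words), 2))

def from_bigram_tokens(text):
    # Undo the merge before handing perturbed samples to the real model.
    return text.replace("_", " ")

def make_predict_fn(bert_predict_proba):
    # `bert_predict_proba` is assumed to take a list of raw strings and
    # return an (n_samples, n_classes) array of probabilities.
    def predict_fn(texts):
        return bert_predict_proba([from_bigram_tokens(t) for t in texts])
    return predict_fn

def explain_with_bigrams(sentence, bert_predict_proba, num_features=5):
    # Requires `pip install lime`; the import is local so the helpers
    # above also work without lime installed.
    from lime.lime_text import LimeTextExplainer
    explainer = LimeTextExplainer(class_names=["negative", "positive"])
    return explainer.explain_instance(
        to_bigram_tokens(sentence),
        make_predict_fn(bert_predict_proba),
        num_features=num_features,
    )
```

This only gives non-overlapping bigrams, and it would break on text that already contains underscores, so I am not sure it is the right approach.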

And other than LIME and SHAP, are there any similar explainers available for better understanding the predictions of a BERT model?

Thank you so much!!!