How do I "write a preprocessing function that works on any of the GLUE tasks"?

Hi experts,

I am stuck on "writing a preprocessing function that works on any of the GLUE tasks." How can I adjust the following function so it supports the different dataset formats?
```python
def tokenize_function2(example):
    return tokenizer(example["sentence"], truncation=True)
```
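For context, here is what I have found so far: the GLUE tasks do not all use the same column names (e.g. MRPC has `sentence1`/`sentence2`, QNLI has `question`/`sentence`, SST-2 has just `sentence`), so one common approach is a task-to-columns mapping and a function that dispatches on whether the task has one sentence or a pair. A minimal sketch, assuming `tokenizer` is an already-loaded Hugging Face tokenizer (the `make_tokenize_function` helper name is my own):

```python
# Map each GLUE task to its input column name(s); the column names below
# match the GLUE datasets on the Hugging Face Hub. None as the second
# entry marks a single-sentence task.
task_to_keys = {
    "cola": ("sentence", None),
    "mnli": ("premise", "hypothesis"),
    "mrpc": ("sentence1", "sentence2"),
    "qnli": ("question", "sentence"),
    "qqp": ("question1", "question2"),
    "rte": ("sentence1", "sentence2"),
    "sst2": ("sentence", None),
    "stsb": ("sentence1", "sentence2"),
    "wnli": ("sentence1", "sentence2"),
}

def make_tokenize_function(tokenizer, task):
    """Build a preprocessing function for the given GLUE task."""
    key1, key2 = task_to_keys[task]

    def tokenize_function(example):
        if key2 is None:
            # Single-sentence tasks (cola, sst2)
            return tokenizer(example[key1], truncation=True)
        # Sentence-pair tasks: pass both columns so the tokenizer
        # builds one combined input with the right special tokens
        return tokenizer(example[key1], example[key2], truncation=True)

    return tokenize_function
```

Is this roughly the right direction, e.g. to then call `dataset.map(make_tokenize_function(tokenizer, "mrpc"), batched=True)`?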

Thank you.

Turka