How to fine-tune BertForSequenceClassification with PEFT?

First I tried:

from transformers import AutoTokenizer, AutoConfig, AutoModelForSequenceClassification
from peft import (
    get_peft_config,
    get_peft_model,
    get_peft_model_state_dict,
    set_peft_model_state_dict,
    PeftType,
    PeftConfig,
)
checkpoint = "google/bert_uncased_L-4_H-256_A-4"

model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint,
    load_in_8bit=True, 
    device_map='auto',
    num_labels=len(item),  # item: my list of labels, defined earlier in my script
    )

But BertForSequenceClassification doesn't support device_map, so I dropped the 8-bit loading and tried plain LoRA without bitsandbytes.

But that didn't work either:

peft_config = PeftConfig(PeftType.LORA, task_type="SEQ_CLS", base_model_name_or_path="BertForSequenceClassification")
model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint,
    num_labels=len(item), 
    )
model = get_peft_model(model, peft_config)

Now the error is: 'PeftConfig' object has no attribute 'target_modules'.
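
Looking at the PEFT source, my guess is that the LoRA-specific settings (target_modules, r, alpha, ...) live on LoraConfig rather than on the base PeftConfig, so maybe the config is meant to be built like this (the hyperparameters here are placeholders and I haven't verified this):

from peft import LoraConfig, TaskType, get_peft_model

# guess: build a LoraConfig instead of the base PeftConfig
peft_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,  # sequence classification, so the classifier head stays trainable
    r=8,                         # placeholder rank
    lora_alpha=16,
    lora_dropout=0.1,
)
model = get_peft_model(model, peft_config)  # model is the AutoModelForSequenceClassification from above
model.print_trainable_parameters()

But I don't know if that is actually the intended way to set it up.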

Even the Colab notebook from the PEFT blog post didn't work for me, because bitsandbytes expects a different CUDA version than the one I have.

Is there a place with working examples?
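
For reference, this is roughly the end-to-end flow I'm hoping to get working. The toy dataset, output directory name, and hyperparameters below are just placeholders for my real setup, and I haven't been able to run this end to end yet:

from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)
from peft import LoraConfig, TaskType, get_peft_model

checkpoint = "google/bert_uncased_L-4_H-256_A-4"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

# tiny toy dataset standing in for my real labelled data
raw = Dataset.from_dict({
    "text": ["great movie", "terrible movie", "loved it", "hated it"],
    "label": [1, 0, 1, 0],
})
dataset = raw.map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length", max_length=32),
    batched=True,
)

model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)
model = get_peft_model(
    model,
    LoraConfig(task_type=TaskType.SEQ_CLS, r=8, lora_alpha=16, lora_dropout=0.1),
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="lora-bert-cls", per_device_train_batch_size=4, num_train_epochs=1),
    train_dataset=dataset,
)
trainer.train()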