Issue with Testing Multiple Models in Zero-Shot Classification: Same Model Loads Repeatedly?

Hi everyone,

I’m testing some zero-shot classification models on a dataset using the following configuration:

from transformers import pipeline

possible_models = [
    "vicgalle/xlm-roberta-large-xnli-anli", 
    "MoritzLaurer/mDeBERTa-v3-base-mnli-xnli",
    "joeddav/xlm-roberta-large-xnli",
    "MoritzLaurer/multilingual-MiniLMv2-L6-mnli-xnli",
    "MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7",
]

categories = [
    'Carrot',
    'Potato',
    'Tomato',
    'Onion',
    'Broccoli',
    'Pepper',
    'Cucumber',
    'Spinach',
    'Zucchini',
    'Pumpkin',
    'Lettuce',
    'Corn',
]

data = []

for model in possible_models:
    print("*"*20, model, "*"*20)
    classifier = pipeline(
        "zero-shot-classification",
        model=model,
        device=0
    )
    model_result_data = []
    for term in search_terms:  # search_terms is defined earlier in the script
        result = classifier(term, candidate_labels=categories)
        top_label = result["labels"][0]
        top_score = result["scores"][0]
        model_result_data.append((top_label, top_score))
    data.append(model_result_data)
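For reference, a zero-shot pipeline returns a dict whose labels are already sorted by descending score, which is why indexing with `[0]` yields the top prediction. A minimal sketch with illustrative (made-up) values:

```python
# Shape of a single zero-shot-classification result.
# The scores below are illustrative, not real model output.
result = {
    "sequence": "orange",
    "labels": ["Carrot", "Tomato", "Potato"],  # sorted by descending score
    "scores": [0.37, 0.33, 0.30],
}

top_label = result["labels"][0]
top_score = result["scores"][0]
```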

While executing the script, I observe the following outputs in my terminal:

******************** vicgalle/xlm-roberta-large-xnli-anli ********************
Device set to use mps:0
******************** emrecan/bert-base-multilingual-cased-allnli_tr ********************
******************** MoritzLaurer/mDeBERTa-v3-base-mnli-xnli ********************
Device set to use mps:0
Device set to use mps:0
******************** vicgalle/xlm-roberta-large-xnli-anli ********************
Device set to use mps:0
******************** emrecan/bert-base-turkish-cased-allnli_tr ********************
Device set to use mps:0
******************** MoritzLaurer/mDeBERTa-v3-base-mnli-xnli ********************

The issue I’m facing is that instead of loading each model in possible_models exactly once, the script appears to load the same models repeatedly, and even prints models that are not in the list at all. This suggests that some parallel process running in the background may be re-executing the loop.

Has anyone else faced a similar issue, or does anyone have suggestions for ensuring that the listed models load sequentially, each exactly once?
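One classic cause of duplicated banner output on macOS is multiprocessing: with the default "spawn" start method, every worker process re-imports the main module, so any module-level code (including pipeline construction and the banner prints) runs once per worker. This transformers-free sketch shows the `if __name__ == "__main__":` guard that prevents such re-execution; `do_work` is a hypothetical stand-in for the real per-model work:

```python
import multiprocessing as mp


def do_work(name):
    # Hypothetical stand-in for loading a model and classifying.
    return name.upper()


if __name__ == "__main__":
    # Under "spawn" (the default on macOS), each worker re-imports this
    # module. Any model loading left at module level would therefore run
    # again in every worker; keeping it under this guard runs it once.
    mp.set_start_method("spawn", force=True)
    with mp.Pool(2) as pool:
        print(pool.map(do_work, ["vicgalle", "moritzlaurer"]))
```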
• Transformers version: 4.47.1

Any help would be greatly appreciated!


hi @oguzmes

Are you certain that you’re executing the code you’ve provided? Could you please try running this simplified version? It works as expected on my end.

from transformers import pipeline


possible_models = [
    "vicgalle/xlm-roberta-large-xnli-anli", 
    "MoritzLaurer/mDeBERTa-v3-base-mnli-xnli",
    "joeddav/xlm-roberta-large-xnli",
    "MoritzLaurer/multilingual-MiniLMv2-L6-mnli-xnli",
    "MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7",
]

categories = [
    'Carrot',
    'Potato',
    'Tomato',
]

search_terms = ["orange","red","yellow"]

for model in possible_models:
    print("*"*20, model, "*"*20)
    classifier = pipeline(
        "zero-shot-classification",
        model=model,
        device=0
    )
    for term in search_terms:
        result = classifier(term, candidate_labels=categories)
        print(result)
******************** vicgalle/xlm-roberta-large-xnli-anli ********************
Device set to use cuda:0
{'sequence': 'orange', 'labels': ['Carrot', 'Tomato', 'Potato'], 'scores': [0.36940810084342957, 0.33087024092674255, 0.2997216582298279]}
{'sequence': 'red', 'labels': ['Carrot', 'Tomato', 'Potato'], 'scores': [0.37238937616348267, 0.33551305532455444, 0.2920975685119629]}
{'sequence': 'yellow', 'labels': ['Carrot', 'Potato', 'Tomato'], 'scores': [0.3804394006729126, 0.33429375290870667, 0.2852668762207031]}
******************** MoritzLaurer/mDeBERTa-v3-base-mnli-xnli ********************
Device set to use cuda:0
{'sequence': 'orange', 'labels': ['Carrot', 'Tomato', 'Potato'], 'scores': [0.5049746036529541, 0.2485475242137909, 0.24647784233093262]}
{'sequence': 'red', 'labels': ['Carrot', 'Tomato', 'Potato'], 'scores': [0.46467289328575134, 0.29330772161483765, 0.24201945960521698]}
{'sequence': 'yellow', 'labels': ['Carrot', 'Tomato', 'Potato'], 'scores': [0.4435359835624695, 0.28285351395606995, 0.2736104726791382]}
******************** joeddav/xlm-roberta-large-xnli ********************
