Error while fine-tuning zero-shot classification model facebook/bart-large-mnli

I am trying to fine-tune facebook/bart-large-mnli using the NLI technique described here

After the NLI transformation, one example from my input dataset looks like this:

{'labels': 2, 'input_ids': [0, 44758, 3457, 13, 5, 1263, 829, 31, 5, 1263, 8401, 4001, 438, 34, 5, 511, 7390, 2, 2, 713, 1246, 16, 4287, 92, 3457, 4, 2], 'attention_mask': [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1], 'input_sentence': 'Create tests for the response received from the response whihc has the following formatThis example is Add new tests.'}
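For reference, the transformation pairs each premise with a templated hypothesis per candidate label and assigns an MNLI-style label (2 = entailment for the true label, 0 = contradiction otherwise), which matches the 'labels': 2 and the "This example is Add new tests." suffix above. A rough sketch (the function name and candidate labels here are illustrative, not my exact code):

```python
# Sketch of the NLI-style transformation (illustrative, not my exact preprocessing).
# Each premise is paired with one hypothesis per candidate label; the pair for
# the true label gets the MNLI "entailment" id (2), the rest "contradiction" (0).

def to_nli_examples(premise, true_label, candidate_labels):
    examples = []
    for label in candidate_labels:
        hypothesis = f"This example is {label}."  # zero-shot hypothesis template
        nli_label = 2 if label == true_label else 0
        examples.append({
            "premise": premise,
            "hypothesis": hypothesis,
            "labels": nli_label,
        })
    return examples

pairs = to_nli_examples(
    "Create tests for the response received from the response",
    "Add new tests",
    ["Add new tests", "Fix a bug"],  # candidate labels are made up for the sketch
)
```

The premise/hypothesis pair is then tokenized together, which is where the two `2` (</s>) separator ids in the input_ids above come from.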

The hyperparameters and the model parameters are given below:

model_name = 'facebook/bart-large-mnli'

The hyperparameters, which are passed into the training job:

hyperparameters = {
    # 'epochs': 1,
    # 'train_batch_size': 8,
    'do_train': True,
    'model_name': model_name,
    'task_name': 'mnli',
    # 'output_data_dir': '/opt/ml/output/data/',
    'output_dir': '/opt/ml/output/',
    'ignore_mismatched_sizes': True,
    'overwrite_output_dir': True
}
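As I understand it, SageMaker script mode forwards this dict to run_glue.py as command-line flags. A sketch of that mapping (my own illustration, not SageMaker's exact internals):

```python
# Illustrative sketch: each hyperparameter key becomes a "--key value" pair on
# the script's command line (not SageMaker's exact serialization logic).

def to_cli_args(hyperparameters):
    args = []
    for key, value in hyperparameters.items():
        args.append(f"--{key}")
        args.append(str(value))
    return args

hyperparameters = {
    "do_train": True,
    "model_name": "facebook/bart-large-mnli",
    "task_name": "mnli",
    "output_dir": "/opt/ml/output/",
    "ignore_mismatched_sizes": True,
    "overwrite_output_dir": True,
}
cli = to_cli_args(hyperparameters)
```

So whatever ends up in the dict has to be a flag that run_glue.py actually accepts.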

git_config = {'repo': 'https://github.com/huggingface/transformers.git', 'branch': 'v4.26.0'}

Creating the Hugging Face estimator:

huggingface_estimator = HuggingFace(
    entry_point='run_glue.py',
    source_dir='./examples/pytorch/text-classification',
    instance_type='ml.p3dn.24xlarge',
    instance_count=1,
    role=role,
    git_config=git_config,
    transformers_version='4.26.0',
    pytorch_version='1.13.1',
    py_version='py39',
    hyperparameters=hyperparameters,
    distribution=distribution
)

I get the following error when I run this training job on AWS SageMaker:

Dropping the following result as it does not have all the necessary fields:
[1,mpirank:0,algo-1]:{'task': {'name': 'Text Classification', 'type': 'text-classification'}, 'dataset': {'name': 'GLUE MNLI', 'type': 'glue', 'config': 'mnli', 'split': 'train', 'args': 'mnli'}}

And in the end, no model.tar.gz is created.
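From what I have read, SageMaker builds model.tar.gz from whatever the training script writes under /opt/ml/model (the SM_MODEL_DIR path), so I am wondering whether my output_dir setting is related. A small sketch of how I understand the packaging step (the helper below is my own illustration, not SageMaker code):

```python
import os
import tarfile
import tempfile

# Sketch of my understanding: after training, SageMaker archives the contents
# of SM_MODEL_DIR (default /opt/ml/model) into model.tar.gz. If the script
# saves its checkpoint elsewhere (e.g. /opt/ml/output/), nothing gets packaged.
def package_model_dir(model_dir, out_path):
    with tarfile.open(out_path, "w:gz") as tar:
        for name in os.listdir(model_dir):
            tar.add(os.path.join(model_dir, name), arcname=name)
    return out_path

# Usage with a temporary stand-in for /opt/ml/model:
model_dir = tempfile.mkdtemp()
with open(os.path.join(model_dir, "pytorch_model.bin"), "wb") as f:
    f.write(b"\x00")  # placeholder file standing in for a real checkpoint
archive = package_model_dir(model_dir, os.path.join(tempfile.mkdtemp(), "model.tar.gz"))
```

Is that the right mental model, and should output_dir point at /opt/ml/model instead?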

Could someone please help me or point me to documentation that addresses this? Thanks in advance!