Error while loading the model using safetensors

I get the following error when executing this code:

from transformers import pipeline
import torch

pipe = pipeline("text-generation", model="HuggingFaceH4/starchat-beta", torch_dtype=torch.bfloat16, device_map="auto")
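In case it helps to diagnose, here is a small stdlib-only sketch I can run to collect the installed versions of the packages involved (assuming the relevant packages are torch, transformers, and safetensors — this kind of loading error often depends on their versions):

```python
# Print installed versions of the packages involved (stdlib only).
# Assumption: torch/transformers/safetensors versions are what matter here.
from importlib.metadata import version, PackageNotFoundError

for pkg in ("torch", "transformers", "safetensors"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")
```

This needs Python 3.8+ (importlib.metadata), which matches the python3.8 paths in the traceback below.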

Error:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/data/user/.local/lib/python3.8/site-packages/transformers/pipelines/__init__.py", line 779, in pipeline
    framework, model = infer_framework_load_model(
  File "/data/user/.local/lib/python3.8/site-packages/transformers/pipelines/base.py", line 262, in infer_framework_load_model
    model = model_class.from_pretrained(model, **kwargs)
  File "/data/user/.local/lib/python3.8/site-packages/transformers/models/auto/auto_factory.py", line 471, in from_pretrained
    return model_class.from_pretrained(
  File "/data/user/.local/lib/python3.8/site-packages/transformers/modeling_utils.py", line 2795, in from_pretrained
    ) = cls._load_pretrained_model(
  File "/data/user/.local/lib/python3.8/site-packages/transformers/modeling_utils.py", line 3109, in _load_pretrained_model
    state_dict = load_state_dict(shard_file)
  File "/data/user/.local/lib/python3.8/site-packages/transformers/modeling_utils.py", line 440, in load_state_dict
    return safe_load_file(checkpoint_file)
  File "/data/user/.local/lib/python3.8/site-packages/safetensors/torch.py", line 261, in load_file
    result[k] = f.get_tensor(k)
RuntimeError: Viewing a tensor as a new dtype with a different number of bytes per element is not supported.