Error: The model weights are not tied. Please use the `tie_weights` method before using the `infer_auto_device` function, even after adding model.tie_weights()

Hi,
I am trying to use ‘Accelerate’ for running inference on a PMC-Llama model using a GPU and CPU.
Here is my code,

from accelerate import infer_auto_device_map, init_empty_weights
from transformers import AutoConfig, AutoModelForCausalLM

# Build the model skeleton on the "meta" device (no real weights allocated)
config = AutoConfig.from_pretrained("chaoyi-wu/PMC_LLAMA_7B")
with init_empty_weights():
    model = AutoModelForCausalLM.from_config(config)

model.tie_weights()
device_map = infer_auto_device_map(model)
# Getting the warning: The model weights are not tied. Please use the `tie_weights` method before using the `infer_auto_device` function.

I am not able to understand why I am getting this error even though I have added the statement 'model.tie_weights()'.
Thank you. Any help would be highly appreciated.


I had this same issue; if anyone could answer the OP's question, that would be super helpful!

@abhibha
Try upgrading to transformers >= 4.28.1:
!pip install transformers==4.28.1
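Before and after upgrading, it helps to confirm which version is actually installed in the environment you are running in (stdlib only, no assumptions about your setup):

```python
from importlib.metadata import PackageNotFoundError, version

# Print the installed transformers version, if any.
try:
    print(version("transformers"))
except PackageNotFoundError:
    print("transformers is not installed")
```

Checking this first rules out the common case where the notebook kernel is still using an older version than the one pip just installed.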

@Falah Thanks! I am already using transformers version 4.30.2 and am still getting the error.

@abhibha
You can resolve it in Google Colab.

Thanks @Falah, I tried running my code without the 'model.tie_weights()' line, and it no longer gives the warning. I tried running on Google Colab. Should I just ignore the warning?

@abhibha
If you are sure that the warning was triggered by the presence of model.tie_weights(), and the warning disappears when that line is removed, then it’s likely that the warning is related to the usage of model.tie_weights() in your specific context.

Without knowing the specific code and the warning message you received, it’s challenging to determine the exact cause of the warning and whether it is safe to ignore. However, I can provide some general guidance:

  1. Understand the purpose of model.tie_weights():
    In many language models, the input embedding matrix and the output (LM head) projection share the same weight tensor; model.tie_weights() sets up that sharing.
  2. Check the documentation and version compatibility:
    Make sure to refer to the documentation of the library or framework you are using for model.tie_weights().
  3. Consider the impact on your specific use case:
    Depending on your specific use case, tying weights may or may not be necessary. If you are achieving the desired results without using model.tie_weights(), and the warning disappears after removing it, it might be acceptable to leave it out. However, it’s crucial to consider the implications on your model’s performance and training process.
  4. Test and validate your results:
    If you decide to proceed without using model.tie_weights(), thoroughly test your model and validate the results to ensure that it performs as expected and achieves the desired outcomes.
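For point 1 above, "tying weights" usually means the input embedding matrix and the output projection share one parameter tensor. A minimal PyTorch sketch of the idea (TinyLM is an illustrative toy model, not the PMC-Llama architecture):

```python
import torch
import torch.nn as nn

class TinyLM(nn.Module):
    """Toy language model whose output head shares weights with the embedding."""
    def __init__(self, vocab_size=10, hidden=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)          # (vocab, hidden)
        self.head = nn.Linear(hidden, vocab_size, bias=False)  # weight: (vocab, hidden)
        self.tie_weights()

    def tie_weights(self):
        # Point the head at the embedding matrix: one tensor, used in two places.
        self.head.weight = self.embed.weight

    def forward(self, ids):
        return self.head(self.embed(ids))

model = TinyLM()
# The two modules now hold the *same* parameter object.
print(model.head.weight is model.embed.weight)  # True
```

This is why the warning matters for device placement: if the tied tensors were treated as two separate weights, a device map could put them on different devices.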
1 Like

Why post answers from chatGPT?
This answer clearly contains no crucial information and it is not the solution to this error.
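For what it's worth, a common way to sidestep the manual infer_auto_device_map call is to let from_pretrained compute the device map itself via device_map="auto" (a sketch, not tested here; it downloads the full ~7B checkpoint and requires accelerate to be installed):

```python
def load_pmc_llama_auto():
    """Sketch: let from_pretrained place weights itself instead of calling
    infer_auto_device_map by hand. Downloads the full PMC-Llama checkpoint."""
    from transformers import AutoModelForCausalLM  # requires transformers + accelerate

    # device_map="auto" makes transformers/accelerate tie the weights and
    # compute a device map internally, splitting layers across GPU and CPU.
    return AutoModelForCausalLM.from_pretrained(
        "chaoyi-wu/PMC_LLAMA_7B",
        device_map="auto",
    )
```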


@NitzanBar
For us as system developers, AI models are a go-to for programming problems, because traditional methods are slow and we don't reach results quickly.
In your opinion, what is the solution to the problem? It was previously solved in Google Colab.
If you have a solution, please share it so others can benefit.