I am trying to use Accelerate to run inference on a PMC-LLaMA model across a GPU and CPU.
Here is my code:

```python
from accelerate import infer_auto_device_map, init_empty_weights
from transformers import AutoConfig, AutoModelForCausalLM

config = AutoConfig.from_pretrained("chaoyi-wu/PMC_LLAMA_7B")

# Build the model skeleton on the meta device so the 7B model
# does not allocate real memory while computing the device map.
with init_empty_weights():
    model = AutoModelForCausalLM.from_config(config)

model.tie_weights()
device_map = infer_auto_device_map(model)
# Warning: The model weights are not tied. Please use the `tie_weights` method
# before using the `infer_auto_device` function.
```
I am not able to understand why I am getting this warning even though I have added the `model.tie_weights()` statement.
Thank you. Any help would be highly appreciated.
I had this same issue; if anyone could answer the OP's question, that would be super helpful!
You can resolve it by upgrading to transformers >= 4.28.1:

!pip install "transformers>=4.28.1"
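To double-check which transformers version your runtime is actually using, here is a quick sketch; the comparison helper assumes plain dotted-integer versions like 4.30.2 (no pre-release tags):

```python
from importlib.metadata import version, PackageNotFoundError


def meets_minimum(installed: str, minimum: str) -> bool:
    # Compare dotted-integer versions like "4.30.2" vs "4.28.1"
    # by turning them into tuples of ints.
    def to_tuple(v: str) -> tuple:
        return tuple(int(part) for part in v.split("."))

    return to_tuple(installed) >= to_tuple(minimum)


try:
    print(meets_minimum(version("transformers"), "4.28.1"))
except PackageNotFoundError:
    print("transformers is not installed")
```

Restart the runtime after the pip install, or the old version may still be imported.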
@Falah Thanks! I am already using transformers version 4.30.2, and I am still getting the warning.
You can resolve it in Google Colab.
Thanks @Falah. I tried running my code without the `model.tie_weights()` line, and the warning no longer appears. I ran it on Google Colab. Should I ignore the warning?
If you are sure that the warning was triggered by the presence of `model.tie_weights()`, and the warning disappears when that line is removed, then it's likely related to how `model.tie_weights()` is used in your specific context.

Without seeing the full code and the exact warning message, it's challenging to determine the cause and whether it is safe to ignore. However, I can provide some general guidance:

- Understand the purpose of `model.tie_weights()`: weight tying shares one weight tensor between the input embedding layer and the output (LM head) layer, which many causal language models expect.
- Check the documentation and version compatibility: refer to the documentation of the library or framework you are using for `model.tie_weights()`, and make sure your transformers and accelerate versions are compatible with each other.
- Consider the impact on your specific use case: depending on your use case, tying weights may or may not be necessary. If you are achieving the desired results without `model.tie_weights()`, and the warning disappears after removing it, it might be acceptable to leave it out. However, it's crucial to consider the implications for your model's performance and training process.
- Test and validate your results: if you decide to proceed without `model.tie_weights()`, thoroughly test your model and validate the results to ensure that it performs as expected and achieves the desired outcomes.
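To make the "tied weights" idea concrete: tying means the embedding and the LM head reference the same underlying weight object (not two equal copies), which is what lets a device map keep them together. Here is a framework-free sketch of that identity check; `TinyLM` is a made-up illustrative class, not part of transformers:

```python
class TinyLM:
    """Minimal stand-in for a model with an embedding and an LM head."""

    def __init__(self, vocab_size: int, dim: int):
        # The embedding "matrix" is just a nested list here.
        self.embedding = [[0.0] * dim for _ in range(vocab_size)]
        # Untied by default: the head starts as a separate, equal copy.
        self.lm_head = [row[:] for row in self.embedding]

    def tie_weights(self):
        # Tying = both layers reference the SAME object, not equal copies.
        self.lm_head = self.embedding

    def weights_are_tied(self) -> bool:
        # Identity, not equality: copies compare equal but are not tied.
        return self.lm_head is self.embedding


model = TinyLM(vocab_size=8, dim=4)
print(model.weights_are_tied())  # False: equal values, different objects
model.tie_weights()
print(model.weights_are_tied())  # True: one shared weight object
```

On a real transformers model, the analogous check is roughly `model.get_input_embeddings().weight is model.get_output_embeddings().weight`; when that is False on a freshly built model, calling `model.tie_weights()` before `infer_auto_device_map` is what the warning is asking for.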