When enabling Phi-3-small on non-CUDA devices, the assert on the flash_attn package's availability in `__init__` makes it impossible to run without modifying the modeling script. Could the assert be replaced with a warning message? That would leave the door open for the flash_attn path to be swapped out afterwards for an alternative attention implementation.
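A minimal sketch of what the suggested change could look like, assuming the original modeling code simply asserts that flash_attn can be imported (the function name and module-level flag here are illustrative, not the actual Phi-3 code):

```python
import importlib.util
import warnings

def check_flash_attn_available() -> bool:
    # Instead of `assert importlib.util.find_spec("flash_attn") is not None`,
    # record availability and warn, so callers can substitute a fallback
    # attention implementation on non-CUDA devices.
    available = importlib.util.find_spec("flash_attn") is not None
    if not available:
        warnings.warn(
            "flash_attn is not available; an alternative attention "
            "implementation must be provided."
        )
    return available

# Hypothetical module-level flag the rest of the modeling code could branch on.
FLASH_ATTN_AVAILABLE = check_flash_attn_available()
```

The key point is that availability becomes a runtime branch rather than a hard import-time failure.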
Hi,
Phi-3 is available natively in the Transformers library, so please use from_pretrained without specifying trust_remote_code=True.
Docs: Phi-3
Thanks Niels. With transformers from_pretrained(trust_remote_code=False), Phi-3-small still insists on setting it to True (see the traceback below). Do you know of any plan for Transformers to support Phi-3-small natively? I'm looking for a way to mitigate the assertion in the __init__ of the original modeling script.
Python 3.10.14 | packaged by conda-forge | (main, Mar 20 2024, 12:45:18) [GCC 12.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
import torch
from transformers import AutoModelForCausalLM
model = "microsoft/Phi-3-small-8k-instruct"
model = AutoModelForCausalLM.from_pretrained(model, torch_dtype="auto", trust_remote_code=False)
config.json: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1.57k/1.57k [00:00<00:00, 15.7MB/s]
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/xfeng8/miniforge3/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 524, in from_pretrained
config, kwargs = AutoConfig.from_pretrained(
File "/home/xfeng8/miniforge3/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 979, in from_pretrained
trust_remote_code = resolve_trust_remote_code(
File "/home/xfeng8/miniforge3/lib/python3.10/site-packages/transformers/dynamic_module_utils.py", line 640, in resolve_trust_remote_code
raise ValueError(
ValueError: Loading microsoft/Phi-3-small-8k-instruct requires you to execute the configuration file in that repo on your local machine. Make sure you have read the code there to avoid malicious use, then set the option trust_remote_code=True to remove this error.
$ pip list|grep transformers
transformers 4.44.0
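Until the assert is relaxed upstream, one unsupported workaround is to register a stub module before the custom modeling code is imported, so that its flash_attn import check passes. This is only a sketch of the import mechanics: the stub provides no attention kernels, so an alternative attention path would still have to be wired in for the model to actually run.

```python
import sys
import types
import warnings

# Hypothetical workaround: if flash_attn is not installed, register an empty
# stub module under its name so a later `import flash_attn` (or an assert on
# its availability) in remote modeling code does not fail outright.
if "flash_attn" not in sys.modules:
    warnings.warn(
        "Stubbing out flash_attn; its attention kernels are NOT available "
        "and must be replaced by another implementation."
    )
    sys.modules["flash_attn"] = types.ModuleType("flash_attn")

import flash_attn  # resolves to the stub (or the real package, if installed)
```

Note that this silences the import error but not any code that actually calls into flash_attn's kernels, so it is at best a stopgap while testing alternative implementations.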