Qwen failed to build on Huggingface Inference Endpoint

Model: Qwen/Qwen1.5-14B-Chat

Platform: Huggingface Inference Endpoints

Error log:
2024/02/23 09:08:17 ~ 2024-02-23 01:08:17,369 | INFO | Initializing model from directory:/repository
2024/02/23 09:08:17 ~ 2024-02-23 01:08:17,369 | INFO | No custom pipeline found at /repository/handler.py
2024/02/23 09:08:17 ~ 2024-02-23 01:08:17,369 | INFO | Using device GPU
2024/02/23 09:08:17 ~ Traceback (most recent call last):
2024/02/23 09:08:17 ~ File "/opt/conda/lib/python3.9/site-packages/starlette/routing.py", line 705, in lifespan
2024/02/23 09:08:17 ~ async with self.lifespan_context(app) as maybe_state:
2024/02/23 09:08:17 ~ File "/opt/conda/lib/python3.9/site-packages/starlette/routing.py", line 584, in __aenter__
2024/02/23 09:08:17 ~ await self._router.startup()
2024/02/23 09:08:17 ~ File "/opt/conda/lib/python3.9/site-packages/starlette/routing.py", line 682, in startup
2024/02/23 09:08:17 ~ await handler()
2024/02/23 09:08:17 ~ File "/app/webservice_starlette.py", line 57, in some_startup_task
2024/02/23 09:08:17 ~ inference_handler = get_inference_handler_either_custom_or_default_handler(HF_MODEL_DIR, task=HF_TASK)
2024/02/23 09:08:17 ~ File "/app/huggingface_inference_toolkit/handler.py", line 45, in get_inference_handler_either_custom_or_default_handler
2024/02/23 09:08:17 ~ return HuggingFaceHandler(model_dir=model_dir, task=task)
2024/02/23 09:08:17 ~ File "/app/huggingface_inference_toolkit/handler.py", line 17, in __init__
2024/02/23 09:08:17 ~ self.pipeline = get_pipeline(model_dir=model_dir, task=task)
2024/02/23 09:08:17 ~ File "/app/huggingface_inference_toolkit/utils.py", line 261, in get_pipeline
2024/02/23 09:08:17 ~ hf_pipeline = pipeline(task=task, model=model_dir, device=device, **kwargs)
2024/02/23 09:08:17 ~ File "/opt/conda/lib/python3.9/site-packages/transformers/pipelines/__init__.py", line 705, in pipeline
2024/02/23 09:08:17 ~ config = AutoConfig.from_pretrained(model, _from_pipeline=task, **hub_kwargs, **model_kwargs)
2024/02/23 09:08:17 ~ File "/opt/conda/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py", line 998, in from_pretrained
2024/02/23 09:08:17 ~ config_class = CONFIG_MAPPING[config_dict["model_type"]]
2024/02/23 09:08:17 ~ File "/opt/conda/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py", line 710, in __getitem__
2024/02/23 09:08:17 ~ raise KeyError(key)
2024/02/23 09:08:17 ~ KeyError: 'qwen2'
v19rg 2024-02-23T01:08:17.370+00:00
2024/02/23 09:08:17 ~ Application startup failed. Exiting.
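My reading of the `KeyError: 'qwen2'` is that the container's `transformers` version predates the Qwen2 architecture, so `CONFIG_MAPPING` has no entry for the `model_type` in the repo's `config.json`. As far as I can tell Qwen2 support landed in transformers v4.37.0 (worth double-checking against the release notes). A minimal stdlib sketch of that version check — `supports_qwen2` and the `4.37.0` minimum are my own assumptions, not part of the toolkit:

```python
# Sketch: compare an installed transformers version string against the
# assumed minimum release (4.37.0) that added the 'qwen2' model type.
def supports_qwen2(version: str, minimum: str = "4.37.0") -> bool:
    # Compare the first three numeric components (e.g. "4.36.2" -> (4, 36, 2)).
    def parse(v: str) -> tuple:
        return tuple(int(part) for part in v.split(".")[:3])
    return parse(version) >= parse(minimum)

print(supports_qwen2("4.36.2"))  # False: predates qwen2 support
print(supports_qwen2("4.37.0"))  # True
```

If the endpoint image reports anything below that, pinning a newer `transformers` (for example via a custom `requirements.txt` or handler, if the platform supports it) would be the direction I'd investigate.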

Anyone know how to fix this?