Inference Endpoint Pix2Struct Error

I am trying to deploy a Pix2StructForConditionalGeneration model, but I am getting this error:

INFO | Initializing model from directory:/repository
INFO | Found custom pipeline at /repository/handler.py
Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/site-packages/starlette/routing.py", line 677, in lifespan
    async with self.lifespan_context(app) as maybe_state:
  File "/opt/conda/lib/python3.9/site-packages/starlette/routing.py", line 566, in __aenter__
    await self._router.startup()
  File "/opt/conda/lib/python3.9/site-packages/starlette/routing.py", line 654, in startup
    await handler()
  File "/app/webservice_starlette.py", line 57, in some_startup_task
    inference_handler = get_inference_handler_either_custom_or_default_handler(HF_MODEL_DIR, task=HF_TASK)
  File "/app/huggingface_inference_toolkit/handler.py", line 42, in get_inference_handler_either_custom_or_default_handler
    custom_pipeline = check_and_register_custom_pipeline_from_directory(model_dir)
  File "/app/huggingface_inference_toolkit/utils.py", line 192, in check_and_register_custom_pipeline_from_directory
    spec.loader.exec_module(handler)
  File "<frozen importlib._bootstrap_external>", line 850, in exec_module
  File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
  File "/repository/handler.py", line 2, in <module>
    from transformers import Pix2StructForConditionalGeneration, Pix2StructProcessor
ImportError: cannot import name 'Pix2StructForConditionalGeneration' from 'transformers' (/opt/conda/lib/python3.9/site-packages/transformers/__init__.py)
Application startup failed. Exiting.

The same traceback is printed again at 20:54:50 when the endpoint retries startup.
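For context, /repository/handler.py is a minimal custom handler. The import on line 2 is the one shown in the traceback; the rest below is only a rough sketch of what it does (the EndpointHandler layout follows the custom handler convention, and the pre/post-processing details are approximate):

    from PIL import Image
    from transformers import Pix2StructForConditionalGeneration, Pix2StructProcessor

    class EndpointHandler:
        def __init__(self, path=""):
            # load the processor and model from the repository directory
            self.processor = Pix2StructProcessor.from_pretrained(path)
            self.model = Pix2StructForConditionalGeneration.from_pretrained(path)

        def __call__(self, data):
            # "inputs" is assumed to already be a PIL image here; the real
            # handler decodes the request payload first
            image: Image.Image = data["inputs"]
            inputs = self.processor(images=image, return_tensors="pt")
            generated_ids = self.model.generate(**inputs, max_new_tokens=50)
            return self.processor.batch_decode(generated_ids, skip_special_tokens=True)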

Judging by the paths in the traceback, the endpoint runs Python 3.9, and the transformers version installed in that environment appears to be too old to include Pix2Struct.
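My understanding from the custom handler documentation is that a requirements.txt placed next to handler.py in the model repository gets installed when the endpoint is built, so I am assuming I can pin a transformers release that actually ships Pix2Struct there (I believe support landed around 4.27; the exact pin below is my guess, not something taken from the logs):

    # requirements.txt at the root of the model repository
    transformers>=4.27.0

Is that the recommended way to get a newer transformers into the default inference toolkit image, or do I need a fully custom container?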

I also tried pointing the endpoint at a custom Docker container: huggingface/transformers-pytorch-gpu:latest.

But with that image, the endpoint only prints the CUDA container banner and never finishes starting:

==========
== CUDA ==
==========

CUDA Version 11.8.0

Container image Copyright (c) 2016-2023, NVIDIA CORPORATION & AFFILIATES. All rights reserved.

This container image and its contents are governed by the NVIDIA Deep Learning Container License.
By pulling and using the container, you accept the terms and conditions of this license:
https://developer.nvidia.com/ngc/nvidia-deep-learning-container-license

A copy of this license is made available in this container at /NGC-DL-CONTAINER-LICENSE for your convenience.

The same banner is printed again at 13:28:12, and nothing else appears after it.

Please help me.