I’m trying to deploy a Gradio application on Kubernetes. My main issue is that when the inference function raises an error, the app does not shut down as it should. Because of that, the pod keeps running and Kubernetes never replaces it with a new one.
What is the best way to handle this?
import gradio as gr
import pandas as pd

from main_module.main import run_inference


def read_file(file_object):
    """Read the input file, run inference, and write the result to disk."""
    df = pd.read_excel(file_object.name, engine="openpyxl")
    output: pd.DataFrame = run_inference(df)
    file_output_path = "result.xlsx"
    # Note: to_excel no longer accepts an encoding argument in recent pandas.
    output.to_excel(file_output_path, index=False)
    return file_output_path


def create_gradio():
    """Create and launch the Gradio app."""
    iface = gr.Interface(fn=read_file, inputs="file", outputs="file")
    try:
        iface.launch(server_name="0.0.0.0", server_port=8080)
    except KeyboardInterrupt:
        iface.close()
    except Exception as e:
        print(e)
        iface.close()


if __name__ == "__main__":
    create_gradio()
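One detail worth noting: the try/except around iface.launch() only catches errors from starting the server. Exceptions raised inside read_file happen later, in a Gradio worker thread, so they never reach that handler and the process stays alive. A sketch of one possible fail-fast approach (the decorator name exit_on_failure is my own, not a Gradio API): terminate the whole process when a handler fails, so Kubernetes sees the container die and restarts the pod. os._exit(1) is used because sys.exit in a worker thread only ends that thread.

```python
import functools
import os
import traceback


def exit_on_failure(fn):
    """Hypothetical decorator: if the wrapped handler raises, log the
    traceback and kill the entire process with a non-zero exit code."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        try:
            return fn(*args, **kwargs)
        except Exception:
            traceback.print_exc()
            # os._exit terminates the process even from a worker thread,
            # unlike sys.exit, which only raises SystemExit in that thread.
            os._exit(1)
    return wrapper


@exit_on_failure
def double(x):
    # Stand-in for a handler like read_file; decorate the real handler
    # the same way.
    return 2 * x
```

Whether crashing on every inference error is desirable is a design choice: an alternative is to keep the app alive, report the error to the user (e.g. by raising an error Gradio can display), and rely on a Kubernetes livenessProbe against a health endpoint to restart only genuinely wedged pods.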