How to create a BERTopic inference endpoint correctly? (Hugging Face)

I’m trying to create an endpoint with a custom endpoint handler that loads a trained model. I understand that my code is not quite correct, but at least it works locally. Can you tell me how to do it properly? I spent the whole day on this but couldn’t find a single clue. AWS crashes with an error at the launch stage of the inference endpoint.

Attempts to load the model with the AutoModel class ended with me not knowing where to get config.json.
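For context, that attempt looked roughly like this (the repo id is a placeholder); it fails because a repo saved with BERTopic.save() doesn’t contain the config.json that AutoModel expects:

from transformers import AutoModel

# placeholder repo id; a BERTopic repo has no config.json, so
# AutoModel.from_pretrained() raises an error when it tries to read the config
model = AutoModel.from_pretrained("my-user/my-bertopic-model")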

import os
from typing import Dict, List, Any

from bertopic import BERTopic

class EndpointHandler:
    def __init__(self, path: str = ""):
        # "path" points at the repository directory inside the endpoint container;
        # the trained BERTopic model was saved there under the name "model"
        self.model = BERTopic.load(os.path.join(path, "model"))
        self.model.calculate_probabilities = False

    def __call__(self, data: Dict[str, Any]) -> List[Dict[str, Any]]:
        """
        data args:
            text (:obj:`str`)
        Return:
            A :obj:`list` | `dict`: will be serialized and returned
        """
        # get inputs
        text = data.pop("text")

        # run prediction; BERTopic.transform returns a (topics, probabilities) tuple
        topics, _ = self.model.transform([text])
        # cast numpy integers to plain ints so the result can be JSON-serialized
        return [{"topic": int(t)} for t in topics]
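
For reference, this is roughly how I test the handler locally (a sketch; it assumes a trained BERTopic model saved as "model" in the working directory):

# local smoke test, assuming a trained model was saved as "./model"
handler = EndpointHandler(path=".")
result = handler({"text": "The stock market rallied after strong earnings reports."})
print(result)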