I'm trying to run a facebook/XGLM model but have issues

Hello,

I’ve tried deploying the XGLM model on SageMaker, but it wasn’t working, so I tried to load the model as a PreTrainedModel with a PretrainedConfig. However, I’m finding that this text-generation model doesn’t seem to be supported by Hugging Face the way other models are: when I try to use AutoConfig, it says Hugging Face doesn’t have support for this model.

I was able to load the model since the checkpoint is in ‘.bin’ format, which I believe is a PyTorch model; it should also be possible to convert it to a TensorFlow model via TFPreTrainedModel. But when I load it the first way, I get a warning saying that most of the layers in the checkpoint were not used when initializing the model’s weights.

However, when I load it with TFPreTrainedModel, it loads but asks for the input and output dimensions. My end goal is to build an API around this model. Can I get any help or support from Hugging Face on this?

from transformers import PreTrainedModel, TFPreTrainedModel
from transformers import PretrainedConfig
import json

base_path = './'  # directory containing config.json and pytorch_model.bin

def open_json(path):
    with open(path, 'r') as f:
        return json.load(f)

def main():
    json_config = open_json(base_path + 'config.json')
    # Build the config from the parsed dict, not a JSON string
    cf = PretrainedConfig.from_dict(json_config)
    # Load the PyTorch checkpoint
    model = PreTrainedModel.from_pretrained(base_path, config=cf)
    # Load as TensorFlow, converting from the PyTorch weights
    model = TFPreTrainedModel.from_pretrained(base_path, config=cf, from_pt=True)

main()
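For reference, this is the kind of loading code I was expecting to work, using the Auto classes rather than the base PreTrainedModel/PretrainedConfig classes. This is only a sketch under the assumption that my transformers version includes XGLM support; `facebook/xglm-564M` is one of the published checkpoint ids on the Hub.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

def generate_text(prompt, model_name="facebook/xglm-564M", max_new_tokens=30):
    # Download (or load from cache) the tokenizer and model weights
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)
    inputs = tokenizer(prompt, return_tensors="pt")
    # Greedy generation; sampling parameters can be tuned as needed
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate_text("Hello, my name is"))
```

If this pattern worked, wrapping `generate_text` in a small web endpoint would cover the API use case.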