Error loading custom model

I followed the Building custom models guide to share a custom PyTorch model I trained.

Now when I try to load the model with `from_pretrained`, I get this error:

```
ValueError: The model class you are passing has a config_class attribute that is not consistent with the config class you passed (model has <class 'reddit_comments_model.configuration_reddit_comments.ConfigRedditComments'> and you passed <class 'transformers_modules.varghesebabu.RedditComments2022.b3144afd5081ab4b5208c6b1d73b997ee8bcc9d6.configuration_reddit_comments.ConfigRedditComments'>. Fix one of those so they match!
```
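As the error shows, the two config classes have the same name but live under different module paths (`reddit_comments_model...` vs. `transformers_modules...`), so the identity check fails. A minimal sketch of why that happens, with no transformers dependency (the file and module names here are illustrative, mimicking the local copy vs. the Hub-downloaded copy):

```python
import importlib.util
import os
import tempfile

# The same class definition, written once to disk.
source = "class ConfigRedditComments:\n    pass\n"

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "configuration_reddit_comments.py")
    with open(path, "w") as f:
        f.write(source)

    def load_as(name):
        # Import the same file under a given module name.
        spec = importlib.util.spec_from_file_location(name, path)
        mod = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(mod)
        return mod.ConfigRedditComments

    # Simulating the two import paths from the traceback:
    LocalConfig = load_as("reddit_comments_model.configuration_reddit_comments")
    HubConfig = load_as("transformers_modules.configuration_reddit_comments")

    # Same source, same class name, but two distinct class objects,
    # so an identity/consistency check between them fails.
    print(LocalConfig is HubConfig)  # False
```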

My config.json:

```json
{
  "architectures": ["RedditComments"],
  "auto_map": {
    "AutoConfig": "configuration_reddit_comments.ConfigRedditComments",
    "AutoModel": "modeling_reddit_comments.RedditComments"
  },
  "embed_dim": 512,
  "model_type": "reddit_comments",
  "n_heads": 16,
  "num_layers": 16,
  "torch_dtype": "float32",
  "transformers_version": "4.48.1",
  "vocab_size": 50258
}
```

What should I do?

EDIT: Looks like a bug in this version (4.48.1).

It works when I use transformers version 4.40.2.