config = BertConfig.from_json_file('./bert_model/bert_config.json')
model = TFBertModel.from_pretrained('bert_model/bert_model.ckpt', config=config)

or

config = BertConfig.from_json_file('./bert_model/bert_config.json')
model = TFBertModel(config).load_weights('bert_model/bert_model.ckpt')
Thanks for your reply.
I still have some questions.
Should path/to/your/bert/directory be path/ or path/model.ckpt?
If I use path/, there is an error: OSError: Error no file named ['pytorch_model.bin', 'tf_model.h5'] found in directory bert_model/ or from_pt set to False
If I use path/model.ckpt, there is another error: OSError: Unable to load weights from h5 file. If you tried to load a TF 2.0 model from a PyTorch checkpoint, please set from_pt=True.
It seems like the code can't recognize my ckpt file.
The code is like this:
config = BertConfig.from_json_file('./bert_model/bert_config.json')
model = TFBertModel.from_pretrained('bert_model/bert_model.ckpt', config=config)
As far as I know, the path/to/your/bert/directory in the `from_pretrained` function should point to the root of the directory where the model / tokenizer files are stored, not to individual files.
Can you share the contents of your directory? It seems there might be some files missing. Also, are you trying to load a model that was trained in PyTorch or TensorFlow?
Thanks for the clarification - I see in the docs that one can indeed point from_pretrained at a TF checkpoint file:
A path or url to a tensorflow index checkpoint file (e.g, ./tf_model/model.ckpt.index ). In this case, from_tf should be set to True and a configuration object should be provided as config argument. This loading path is slower than converting the TensorFlow checkpoint in a PyTorch model using the provided conversion scripts and loading the PyTorch model afterwards.
It seems that what you’re missing is the from_tf=True argument, so maybe something like the following works:
config = BertConfig.from_pretrained("path/to/your/bert/directory")
model = TFBertModel.from_pretrained("path/to/bert_model.ckpt.index", config=config, from_tf=True)
I'm not sure whether the config should be loaded with from_pretrained or from_json_file, but maybe you can test both to see which one works.
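As the docs excerpt above notes, the recommended path is converting the TF checkpoint to a PyTorch model first. A sketch of that conversion with the `transformers-cli convert` command the library ships (paths are guesses based on your directory layout, adjust as needed):

```shell
# Convert the original TF 1.x BERT checkpoint to a PyTorch weights file.
# Requires both torch and tensorflow to be installed.
transformers-cli convert --model_type bert \
  --tf_checkpoint bert_model/bert_model.ckpt \
  --config bert_model/bert_config.json \
  --pytorch_dump_output bert_model/pytorch_model.bin
```

After this, bert_model/ contains a pytorch_model.bin, which from_pretrained can load directly (with from_pt=True for the TF classes).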
I added from_tf to it.
But it raises an error (the version of transformers is the latest one): TypeError: ('Keyword argument not understood:', 'from_tf')
It seems like the from_tf parameter is only for the PyTorch classes, and the TensorFlow classes only have the from_pt parameter.
The docs differ between PyTorch and TensorFlow; the TF version of the docs only lists from_pt.
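Right - from_tf belongs to the PyTorch classes. One workaround, then, is to go through PyTorch: load the TF 1.x checkpoint with the PyTorch BertModel (which does accept from_tf=True), save it, and reload with the TF class. A sketch, assuming your files live in ./bert_model/ and both torch and tensorflow are installed (the ./bert_model_pt output directory name is just an example):

```python
from transformers import BertConfig, BertModel, TFBertModel

config = BertConfig.from_json_file('./bert_model/bert_config.json')

# The PyTorch class supports from_tf=True; per the docs, point it at the
# .ckpt.index file of the TF checkpoint.
pt_model = BertModel.from_pretrained(
    './bert_model/bert_model.ckpt.index', config=config, from_tf=True
)

# Save in the library's own format (writes pytorch_model.bin + config.json).
pt_model.save_pretrained('./bert_model_pt')

# The TF 2.0 class can now load the PyTorch weights with from_pt=True.
tf_model = TFBertModel.from_pretrained('./bert_model_pt', from_pt=True)
```

The conversion only has to be done once; afterwards you can load from ./bert_model_pt directly.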