Can't save ConvBERT tokenizer

When I try to call tokenizer.save_pretrained() on a ConvBERT tokenizer, I get this error:

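Roughly what I'm running (the checkpoint name below is just an example I picked to illustrate the call; any ConvBERT checkpoint goes through the same code path):

from transformers import AutoTokenizer

# example ConvBERT checkpoint; substitute whichever model you are actually using
tokenizer = AutoTokenizer.from_pretrained("YituTech/conv-bert-base")

# the save directory name doesn't matter; this call raises the PanicException below
tokenizer.save_pretrained("jebac_huggingface")
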
PanicException                            Traceback (most recent call last)
<ipython-input-9-d95441fe8bb1> in <module>()
----> 1 tokenizer.save_pretrained('jebac_huggingface')

1 frames
/usr/local/lib/python3.7/dist-packages/transformers/tokenization_utils_base.py in save_pretrained(self, save_directory, legacy_format, filename_prefix, push_to_hub, **kwargs)
   2106             file_names=file_names,
   2107             legacy_format=legacy_format,
-> 2108             filename_prefix=filename_prefix,
   2109         )
   2110 

/usr/local/lib/python3.7/dist-packages/transformers/tokenization_utils_fast.py in _save_pretrained(self, save_directory, file_names, legacy_format, filename_prefix)
    597                 save_directory, (filename_prefix + "-" if filename_prefix else "") + TOKENIZER_FILE
    598             )
--> 599             self.backend_tokenizer.save(tokenizer_file)
    600             file_names = file_names + (tokenizer_file,)
    601 

PanicException: no entry found for key

What can I do about it?

Solved by passing legacy_format=True:

tokenizer.save_pretrained(save_dir, legacy_format=True)
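
For context, a minimal sketch of the workaround (save_dir here is a placeholder for any target directory). Judging from the traceback, legacy_format=True makes save_pretrained write the slow-tokenizer files (vocab.txt, special_tokens_map.json, etc.) and skip the backend_tokenizer.save() call that produces tokenizer.json, which is where the panic comes from:

from transformers import AutoTokenizer

save_dir = "convbert_tokenizer"  # placeholder directory

# write only the legacy vocabulary files, avoiding the failing tokenizer.json export
tokenizer.save_pretrained(save_dir, legacy_format=True)

# the saved files reload the usual way
tokenizer = AutoTokenizer.from_pretrained(save_dir)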