Hello there!
I've been working on translation tasks, fine-tuning some examples from the Transformers repository.
The BLEU score is reasonably good, but the Inference widget on the Hub returns an empty string (or just a period) for translation, and locally I get an error I haven't been able to get past.
I managed to upload the model successfully to the Hub.
When the model was uploaded to the Hub, it was tagged with another task, and I changed it following the recommendations from this issue.
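As far as I understand, that change amounts to setting the pipeline_tag in the model card metadata. A minimal sketch of doing it programmatically (assuming huggingface_hub's metadata_update is the right tool for this) would be:

from huggingface_hub import metadata_update

# Sketch: point the Hub widget at the translation task by updating the
# README.md metadata. overwrite=True is needed if a pipeline_tag is already set.
metadata_update(
    "SoyGema/english-hebrew",
    {"pipeline_tag": "translation"},
    overwrite=True,
)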
However, when I run Inference it returns an empty string.
When I try to do it locally via this script, I get the following warning and output:
python inference.py --model_name="SoyGema/english-hebrew" --input_text="Hello, my name is Sara"
Downloading (…)lve/main/config.json: 100%|█████████████████████████| 1.67k/1.67k [00:00<00:00, 3.64MB/s]
Downloading model.safetensors: 100%|█████████████████████████████████| 242M/242M [00:07<00:00, 31.4MB/s]
Downloading (…)neration_config.json: 100%|██████████████████████████████| 117/117 [00:00<00:00, 422kB/s]
Downloading (…)okenizer_config.json: 100%|█████████████████████████| 2.32k/2.32k [00:00<00:00, 6.87MB/s]
Downloading spiece.model: 100%|██████████████████████████████████████| 792k/792k [00:00<00:00, 2.08MB/s]
Downloading (…)/main/tokenizer.json: 100%|█████████████████████████| 2.42M/2.42M [00:00<00:00, 3.93MB/s]
Downloading (…)cial_tokens_map.json: 100%|█████████████████████████| 2.20k/2.20k [00:00<00:00, 7.49MB/s]
/Users/gema/Documents/The-Lord-of-The-Words-The-two-frameworks/.torchenv/lib/python3.10/site-packages/transformers/pipelines/__init__.py:1003: UserWarning: "translation" task was used, instead of "translation_XX_to_YY", defaulting to "translation_en_to_de"
warnings.warn(
Translated text: , .
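For context, a minimal version of what I'm trying locally looks roughly like this (a sketch only, not the exact script; the explicit translation_en_to_he task name is my guess at the right pair, to avoid the en_to_de fallback from the warning above):

from transformers import pipeline

# Sketch: pass the language pair explicitly so the pipeline does not fall back
# to the default "translation_en_to_de" task mentioned in the warning.
# The task name "translation_en_to_he" is an assumption about the intended pair.
translator = pipeline(
    "translation_en_to_he",
    model="SoyGema/english-hebrew",
)

result = translator("Hello, my name is Sara", max_length=128)
print("Translated text:", result[0]["translation_text"])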
So, moving on from that, what I did was change the config.json file to add the language pair at hand, but I wasn't able to make it work. It still returns an empty string, and it seems to be picking up another default config…
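For reference, the kind of config.json change I mean is adding an explicit translation entry under task_specific_params. In Python terms it would be roughly the following (the key layout mirrors the usual T5-style configs, and the translation_en_to_he name and prefix wording are assumptions on my side):

from transformers import AutoConfig

# Sketch: declare the language pair in the model config so the pipeline/widget
# can pick it up. Key name and prefix are assumptions, not the exact edit I made.
config = AutoConfig.from_pretrained("SoyGema/english-hebrew")
config.task_specific_params = {
    "translation_en_to_he": {
        "early_stopping": True,
        "max_length": 128,
        "num_beams": 4,
        "prefix": "translate English to Hebrew: ",
    }
}
config.push_to_hub("SoyGema/english-hebrew")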
Any idea of how to move forward from this point would be much appreciated!
Thanks in advance!