Fast_bert: using my fine-tuned model

I created a sentiment analysis model by fine-tuning one of the Hugging Face models. I got good accuracy and saved it, and now I want to use it without going through the training process again.
I used the BertClassificationPredictor function, but it didn't work for me.

My code:

from fast_bert.prediction import BertClassificationPredictor

MODEL_PATH = '/content/latest_model'

predictor = BertClassificationPredictor(
    model_path=MODEL_PATH,
    label_path=LABEL_PATH,  # location of the labels.csv file
    multi_label=False,
    model_type='bert')

# Single prediction
single_prediction = predictor.predict('just get me result for this text')

The model I fine-tuned: https://huggingface.co/asafaya/bert-base-arabic
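As an aside, before constructing the predictor it can help to confirm that the saved model directory actually contains the files a transformers-style save produces. This is a minimal sketch, assuming the usual save_pretrained layout; the helper name and file list are illustrative, not fast_bert API:

```python
import os
import tempfile

# Typical files produced by save_pretrained (assumption: your save step
# used this layout; adjust the list if it differs).
EXPECTED = ["config.json", "vocab.txt", "pytorch_model.bin"]

def missing_files(model_dir):
    """Return the expected files that are absent from model_dir."""
    return [f for f in EXPECTED if not os.path.exists(os.path.join(model_dir, f))]

# Demo with a temporary directory standing in for /content/latest_model:
with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, "config.json"), "w").close()
    print(missing_files(d))  # → ['vocab.txt', 'pytorch_model.bin']
```

If anything is missing, the predictor may end up reading the wrong file for a given role, which can surface as a decoding error rather than a clear "file not found".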

Hi @AhmedBou, it's better to ask this on the fast_bert issues. Also, if you could post the stack trace, I can take a look.

Hi, thank you for your reply. Here's the stack trace:


UnicodeDecodeError                        Traceback (most recent call last)
in ()
      7     label_path=LABEL_PATH, # location for labels.csv file
      8     multi_label=False,
----> 9     model_type='bert')
     10
     11 # Single prediction

/usr/lib/python3.6/codecs.py in decode(self, input, final)
    319         # decode input (taking the buffer into account)
    320         data = self.buffer + input
--> 321         (result, consumed) = self._buffer_decode(data, self.errors, final)
    322         # keep undecoded input until the next call
    323         self.buffer = data[consumed:]

UnicodeDecodeError: 'utf-8' codec can't decode byte 0x80 in position 64: invalid start byte
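For reference, this error means a file was opened as UTF-8 text but contains binary bytes: 0x80 is a continuation byte and can never start a valid UTF-8 sequence. A quick stdlib check can tell whether a file the predictor reads as text (e.g. whatever label_path points at) decodes cleanly. A minimal sketch, with illustrative temp files standing in for the real ones:

```python
import os
import tempfile

def is_valid_utf8(path):
    """Return True if the file at `path` decodes cleanly as UTF-8."""
    try:
        with open(path, encoding="utf-8") as f:
            f.read()
        return True
    except UnicodeDecodeError:
        return False

# Demo: a text file decodes fine; a file with byte 0x80 does not.
with tempfile.NamedTemporaryFile("wb", delete=False, suffix=".csv") as f:
    f.write("positive\nnegative\n".encode("utf-8"))
    good = f.name
with tempfile.NamedTemporaryFile("wb", delete=False, suffix=".csv") as f:
    f.write(b"\x80\x81bad")
    bad = f.name

print(is_valid_utf8(good))  # → True
print(is_valid_utf8(bad))   # → False
os.remove(good)
os.remove(bad)
```

If the labels file itself is valid UTF-8, the likely suspect is a path argument pointing at a binary file (e.g. model weights) that gets opened in text mode.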

Do you think you can set up a Colab for this?