Fine-tune Transformers for text generation

Hello

Is there an example like this one (Fine-tune a pretrained model) for fine-tuning HF Transformers for text generation?

@mwitiderrick Hello :hugs:
You can check out this link for all example notebooks.
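For text generation specifically, the task is causal language modeling. Here is a minimal sketch of fine-tuning with the Trainer API; the gpt2 checkpoint, the small imdb slice, and the training settings are only illustrative placeholders, not a recommendation:

from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# gpt2 is an arbitrary example checkpoint for causal-LM fine-tuning
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Small slice for illustration; use your full corpus in practice
dataset = load_dataset("imdb", split="train[:1%]")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

# mlm=False makes the collator build causal-LM labels from the input ids
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="gpt2-imdb", num_train_epochs=1),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()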

Hello @merve, thanks for the response; I indeed found the notebooks very useful. One follow-up question.

When I run predictions like this for a binary classification problem:

import tensorflow as tf

# `bert` is the fine-tuned model and `logits` is the logits output of a forward pass
predicted_class_id = int(tf.math.argmax(logits, axis=-1)[0])  # index of the highest logit
bert.config.id2label[predicted_class_id]  # map the class index to its label name

I get the result LABEL_1. How do I know whether this is the prediction for class 0 or class 1?

Thanks.

Hello Derrick,

Can you send me the model repo so that I can see the config file?

Hello @merve

Not sure about the repo, but the model is:

from transformers import TFAutoModelForSequenceClassification

# bert-base-uncased with a freshly initialized two-class classification head
model = TFAutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

Hello Derrick,

Sorry, my fault, that label is just the default label name :sweat_smile: I meant: which dataset is the model fine-tuned on?

Hello @merve
It’s imdb:

from datasets import load_dataset

dataset = load_dataset("imdb")

Any update on this, @merve?

@mwitiderrick On the dataset page you can see that label 1 is positive.
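You can also read the label names directly from the dataset object instead of the page; a quick check, assuming the datasets library:

from datasets import load_dataset

dataset = load_dataset("imdb", split="train")
print(dataset.features["label"].names)  # ['neg', 'pos'], so index 1 is positive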

Wanted to clarify that LABEL_1 means label 1 (positive), not label 0.
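To avoid the ambiguity in the first place, you can pass human-readable label names when loading the model, so predictions come back as names instead of the default LABEL_0/LABEL_1. A minimal sketch; the NEGATIVE/POSITIVE names are just an illustrative choice:

from transformers import TFAutoModelForSequenceClassification

id2label = {0: "NEGATIVE", 1: "POSITIVE"}
label2id = {"NEGATIVE": 0, "POSITIVE": 1}

model = TFAutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased",
    num_labels=2,
    id2label=id2label,
    label2id=label2id,
)

# model.config.id2label[1] now returns "POSITIVE" instead of "LABEL_1"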