How to train a model I designed myself with the Transformers framework

Hello everyone.

Since I first came across Hugging Face, I have been exploring it through its NLP course and other documentation for a while. But I haven't seen any instructions on training an architecture that I designed myself. The closest thing I have found is loading an existing architecture without its pretrained weights and training it from scratch.

So, after I design my own architecture (in TensorFlow, since I don't think Transformers supports this), how can I make use of the Datasets, Tokenizers, or even the Trainer and Accelerate libraries?
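For instance, I assume the data-loading and tokenization side would look roughly like this (the dataset and tokenizer names below are just placeholders I picked for illustration):

```python
from datasets import load_dataset
from transformers import AutoTokenizer

# Placeholder dataset and tokenizer, just to show the workflow
raw_datasets = load_dataset("imdb")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize_fn(examples):
    # Truncation/padding settings here are placeholders too
    return tokenizer(examples["text"], truncation=True)

tokenized = raw_datasets.map(tokenize_fn, batched=True)
```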

My model class will look something like this:

```python
import tensorflow as tf

class Transformer(tf.keras.Model):
    def __init__(self, *, arguments):
        super().__init__()
        self.layer1 = something       # e.g. an embedding layer
        self.layer2 = something       # e.g. a stack of encoder blocks
        self.final_layer = something  # e.g. a Dense projection layer

    def call(self, inputs):
        ...  # forward pass through the layers above
```

and after preprocessing the data, I want to use

`prepare_tf_dataset`

and then start training. It would be great if Accelerate, the Trainer, and the other tools could be used here as well.
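To make my intent concrete, here is a rough sketch of what I am hoping to do. (As far as I can tell, `prepare_tf_dataset` is a method on Transformers model classes, so for a plain Keras model I would presumably use `Dataset.to_tf_dataset` instead; please correct me if I'm wrong. Column names and hyperparameters below are placeholders.)

```python
from transformers import DataCollatorWithPadding

# Pads each batch dynamically to the longest sequence in it
collator = DataCollatorWithPadding(tokenizer=tokenizer, return_tensors="tf")

tf_train = tokenized["train"].to_tf_dataset(
    columns=["input_ids", "attention_mask"],  # model inputs
    label_cols=["label"],                     # labels handed to Keras
    batch_size=32,
    shuffle=True,
    collate_fn=collator,
)

model = Transformer(arguments=...)  # my custom class from above
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
model.fit(tf_train, epochs=3)
```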

Thank you guys so much.