Hi all, I'm just getting started with fine-tuning HF models in Keras. I've been using the following code:
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.callbacks import LearningRateScheduler, EarlyStopping

model.compile(
    optimizer=Adam(learning_rate=learning_rate),
    metrics=['accuracy'])

history = model.fit(train_dataset,
                    epochs=epochs,
                    batch_size=batch_size,
                    validation_data=test_dataset,
                    callbacks=[LearningRateScheduler(lr_scheduler),
                               EarlyStopping(monitor='val_loss', patience=5)],
                    verbose=1)
and then running
model.save_pretrained(model_path)
and …from_pretrained(model_path).
I'm using a learning rate scheduler, and the best model often occurs several epochs before the end of training. When I was training my own models I would use
ModelCheckpoint(filepath=model_path, monitor='val_loss', mode='min', save_best_only=True),
which lets me load the best model afterwards.
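For context, my workflow with my own (non-HF) models looked roughly like this; model_path and the dataset names are just placeholders:

```python
import tensorflow as tf

# Illustrative placeholder path.
model_path = "best_model.keras"

checkpoint = tf.keras.callbacks.ModelCheckpoint(
    filepath=model_path,
    monitor="val_loss",
    mode="min",
    # Only overwrite the file when val_loss improves, so model_path
    # always holds the best model seen so far, not the last epoch.
    save_best_only=True,
)

# history = model.fit(train_dataset,
#                     validation_data=test_dataset,
#                     callbacks=[checkpoint])

# After training, the best model can be reloaded with:
# best_model = tf.keras.models.load_model(model_path)
```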
But when using model.save_pretrained() and …from_pretrained() for a pretrained HF model, how can I get it to load the best model instead of the last one?
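One workaround I'm considering (I'm not sure it's the recommended approach) is having EarlyStopping roll the in-memory weights back to the best epoch, so that save_pretrained() afterwards writes the best model rather than the last one:

```python
import tensorflow as tf

early_stopping = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss",
    patience=5,
    # If this works the way I expect, when training stops the model's
    # weights are restored to those of the epoch with the best
    # val_loss, so a subsequent save_pretrained() would save the best
    # model instead of the final-epoch one.
    restore_best_weights=True,
)

# history = model.fit(train_dataset,
#                     validation_data=test_dataset,
#                     callbacks=[early_stopping])
# model.save_pretrained(model_path)  # would now hold the best weights
```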
Thanks for your help!