A question about splitting the training dataset during training

Suppose I have a training dataset of 50 million examples, and I want to train a model on it with a given batch_size and number of epochs.

Now suppose I split the dataset into 5 subsets of 10 million examples each and train like this: first I train my model on subset_1, using the same batch_size and number of epochs as above, and get Model_1. Then I continue training Model_1 on subset_2 with the same settings, and repeat this until all subsets have been used, which gives the final Model.
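To make the two procedures concrete, here is a minimal sketch using plain NumPy SGD on a tiny toy regression problem. The sizes, learning rate, and linear model are all illustrative stand-ins, not the actual 50-million-example setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data standing in for the real dataset (sizes are illustrative).
X = rng.normal(size=(1000, 5))
true_w = rng.normal(size=5)
y = X @ true_w + 0.1 * rng.normal(size=1000)

def sgd_epochs(w, Xd, yd, epochs, batch_size, lr=0.01):
    """Run plain mini-batch SGD for `epochs` passes over (Xd, yd)."""
    n = len(Xd)
    for _ in range(epochs):
        idx = rng.permutation(n)  # shuffle each epoch
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            grad = Xd[b].T @ (Xd[b] @ w - yd[b]) / len(b)
            w = w - lr * grad
    return w

# Approach A (the question): train to completion on one subset
# at a time, carrying the weights over to the next subset.
w_seq = np.zeros(5)
for X_sub, y_sub in zip(np.array_split(X, 5), np.array_split(y, 5)):
    w_seq = sgd_epochs(w_seq, X_sub, y_sub, epochs=3, batch_size=32)

# Approach B (the baseline): every epoch sees the whole dataset.
w_full = sgd_epochs(np.zeros(5), X, y, epochs=3, batch_size=32)

def mse(w):
    return float(np.mean((X @ w - y) ** 2))

print("sequential-subset MSE:", mse(w_seq))
print("full-dataset MSE:", mse(w_full))
```

The key difference is the order in which examples are visited: in Approach A each subset's examples are only mixed with each other, never with examples from later subsets, while in Approach B every epoch shuffles across the entire dataset.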

Will this hurt the accuracy of the final Model compared to training on the whole dataset at each epoch with the same batch_size?