OOM when allocating tensor while embedding 150,000 sentences with BERT

I get this error when building an embedding layer for 150,000 sentences encoded with a BERT model:
tensorflow.python.framework.errors_impl.ResourceExhaustedError: OOM when allocating tensor with shape[1619381,768] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc [Op:RandomUniform]

The embedding layer is

embed = Embedding(1619381, 768, mask_zero=True)(sentence)

I took the number 1619381 from embeddings = torch.cat(embeddings, dim=0), which holds the concatenated BERT embeddings for all 150,000 sentences.
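For context, the shape in the error matches the weight matrix that the Embedding layer tries to allocate (the [Op:RandomUniform] is its initializer). A rough back-of-the-envelope calculation (my own sketch, not part of the failing code) shows why it does not fit on my GPU:

```python
# Memory needed for the Embedding layer's weight matrix.
# Shape (1619381, 768), float32 = 4 bytes per value.
rows, cols, bytes_per_float = 1_619_381, 768, 4
total_bytes = rows * cols * bytes_per_float
print(f"{total_bytes / 1024**3:.1f} GiB")  # prints "4.6 GiB"
```

So the layer alone needs about 4.6 GiB of GPU memory before any activations are allocated, which seems to be what triggers the ResourceExhaustedError.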

Is there any way to fix this?