Lower Memory Usage for TF GPT-J

The snippet below lowers memory usage when loading GPT-J in PyTorch, but how can the same be achieved in TensorFlow?

from transformers import GPTJForCausalLM
import torch

# Load the fp16 branch of the checkpoint directly in half precision
model = GPTJForCausalLM.from_pretrained(
    "EleutherAI/gpt-j-6B", revision="float16", torch_dtype=torch.float16, low_cpu_mem_usage=True
)

It seems that neither the "revision" nor the "low_cpu_mem_usage" parameter is supported when loading the TF version of the model.

Is there any technique I can use to lower the memory usage in TF? My 3090 cannot load the GPT-J-6B model.
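For context, the closest general-purpose technique I am aware of on the TF side is the Keras mixed-precision API (TF >= 2.4). Note that it keeps variables in float32 and only computes in float16, so it may not shrink the stored weights the way `torch_dtype=torch.float16` does. A minimal sketch, not GPT-J specific:

```python
import tensorflow as tf

# Keras mixed-precision: compute in float16, keep variables in float32.
# This mainly reduces activation memory, not the weight storage itself.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

layer = tf.keras.layers.Dense(4)
out = layer(tf.ones((1, 8)))

print(out.dtype)           # compute dtype: float16
print(layer.kernel.dtype)  # variable dtype: float32
```

Whether this policy is enough to fit the full 6B checkpoint on a 24 GB card, I do not know.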