TRANSFORMER_CACHE issue inside jupyter notebook

Hi, I’m excited to start trying out Hugging Face Transformers. I followed the installation instructions and set up a virtual environment in a directory A. The system-wide default cache directory is in my home directory, B, which has a strict disk-space quota. So I changed TRANSFORMERS_CACHE to point somewhere under A, where there is no disk limit.
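For reference, this is roughly how I set the variable in my shell (the path under A is a placeholder, not my real path):

```shell
# Placeholder path under directory A; substitute your actual cache location.
export TRANSFORMERS_CACHE=/path/to/A/hf_cache

# Confirm the shell sees it.
echo "$TRANSFORMERS_CACHE"
```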

This works fine on the command line. I then installed an ipykernel from my virtual env into Jupyter Notebook, and here things get weird. I checked and made sure TRANSFORMERS_CACHE is indeed still pointing to A. However, every time I load a pretrained model, it still looks for the cache in B, not A.
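In case it matters, this is roughly how I checked the variable from inside the notebook (the path is a placeholder). My understanding is that the environment variable must be set before `transformers` is imported, since the library reads it at import time, so I also tried setting it at the top of the notebook like this:

```python
import os

# Placeholder path under directory A; substitute your actual cache location.
# Setting it here, before importing transformers, so the library picks it up.
os.environ["TRANSFORMERS_CACHE"] = "/path/to/A/hf_cache"

# Verify that the kernel process actually sees the variable.
print(os.environ.get("TRANSFORMERS_CACHE"))
```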

Does anyone know how to solve this?