Compatibility Issues Between Diffusers and Huggingface_hub Causing Tokenizer Build Failures in Kaggle Environment

Hello community,

I have been running Stable Diffusion pipelines on Kaggle using the diffusers and huggingface_hub libraries. Until a few days ago everything worked with a specific combination of library versions, but recently the environment started breaking due to dependency conflicts, specifically build failures in the tokenizers package.

The main issue appears to stem from incompatibilities between versions of diffusers, huggingface_hub, and related packages. Despite multiple attempts to downgrade or upgrade versions, I keep running into errors such as:

  • Failed building wheel for tokenizers
  • ImportErrors related to huggingface_hub modules
  • Dependency resolver conflicts reported by pip

I am using the Kaggle environment with GPU enabled and need a stable setup that does not break the tokenizers build or cause runtime errors.

Could anyone share a working set of compatible versions for these libraries (diffusers, huggingface_hub, transformers, tokenizers, accelerate) that have been tested recently in Kaggle? Also, are there any recommended installation steps or environment settings to avoid these conflicts?

Thank you in advance for your help!
Beni


Regarding the “Failed building wheel for tokenizers” error:

This is a known issue with Python packaging, though it is fairly rare nowadays. The Python version itself can sometimes be a factor: if no prebuilt wheel exists for your interpreter, pip falls back to compiling tokenizers from source, which requires a Rust toolchain.

When using Transformers together with Diffusers, it is better to use the latest version if you are running the latest models; if you are using an older model, versions prior to the major Transformers overhaul may work fine (transformers<=4.48.3).
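When debugging these conflicts, it helps to first confirm which versions Kaggle actually installed before and after pinning anything. A minimal stdlib-only check along these lines (the package list simply mirrors the ones named in the question; no specific pins are assumed):

```python
from importlib.metadata import version, PackageNotFoundError

def report_versions(packages):
    """Return a mapping of package name -> installed version string (or None if absent)."""
    found = {}
    for name in packages:
        try:
            found[name] = version(name)
        except PackageNotFoundError:
            # Package is not installed in this environment
            found[name] = None
    return found

if __name__ == "__main__":
    pins = ["diffusers", "huggingface_hub", "transformers", "tokenizers", "accelerate"]
    for name, ver in report_versions(pins).items():
        print(f"{name}: {ver or 'not installed'}")
```

Running this once before installing and once after makes it easy to spot which package pip silently upgraded or downgraded during dependency resolution.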