Interest in Contributing PEFT Educational Resources - Seeking Community Input

Hi HF community,

I’ve been exploring HF’s libraries and documentation, with particular interest in PEFT techniques. I’ve worked with transformers both practically (fine-tuning BERT with HF’s transformers and datasets libraries) and theoretically (implementing the transformer architecture from scratch in PyTorch). In my last job, I also trained 3D object detection models for industrial use.

I’m interested in contributing educational content to help other learners in this space. Before diving in, I’d love to understand:

  1. Are there specific PEFT concepts/techniques learners are most interested in?
  2. What format of educational content (demos, tutorials, comparisons) has been most helpful for your community?
  3. Are there gaps in the current PEFT documentation/examples you’d like to see filled?

I’m happy to put together some specific proposals based on your feedback. My goal is to create resources that would be genuinely useful to the community.

Looking forward to your thoughts and suggestions!

  1. Learners are most interested in LoRA, Prefix Tuning, Adapters, and comparisons of PEFT techniques for efficiency and task suitability.
  2. Hands-on tutorials, interactive demos, and real-world case studies are the most effective formats for learning.
  3. Current gaps include examples for low-resource settings, evaluation best practices, and applications beyond NLP.
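
As a concrete illustration of the hands-on tutorial format, here is a minimal LoRA sketch using the peft library’s `LoraConfig` and `get_peft_model` (the model name, target modules, and hyperparameters are illustrative placeholders, not recommendations):

```python
from transformers import AutoModelForSequenceClassification
from peft import LoraConfig, TaskType, get_peft_model

# Load a base model (bert-base-uncased is just an illustrative choice)
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# LoRA configuration: inject low-rank adapters into the attention projections
lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,   # sequence classification task
    r=8,                          # rank of the low-rank update matrices
    lora_alpha=16,                # scaling factor applied to the LoRA update
    lora_dropout=0.1,
    target_modules=["query", "value"],  # attention sub-module names in BERT
)

# Wrap the base model; only the LoRA parameters remain trainable
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```

From here, the wrapped model can go into the usual Trainer workflow, which is part of why LoRA tutorials are a gentle entry point for learners already comfortable with standard fine-tuning.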

Thanks for this detailed feedback!
It’s really helpful to understand what the community finds most valuable, and it gives me a clear direction for contributing. I’m particularly interested in the low-resource settings angle, since that could make these techniques accessible to more learners.
