Code-Mini-v0.1: An Experiment in Minimalist Code Generation

I’ve built something a bit different, and I want to share it with the Hugging Face community.

Code-Mini-v0.1 is a 90,000-parameter Transformer decoder, trained from scratch on Python code. It doesn't aim for perfection, and it doesn't try to compete with giants like GPT or full code-completion tools. The goal? To see how far you can push a tiny model before it collapses under its own limitations.
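For a sense of scale, here's a back-of-the-envelope parameter count for a decoder in this size class. Every dimension below (vocabulary size, width, depth, context length) is an assumption for illustration, not the actual Code-Mini-v0.1 configuration:

```python
# Rough parameter count for a tiny Transformer decoder.
# All sizes are hypothetical; the real Code-Mini-v0.1 config may differ.

VOCAB = 512        # tiny vocabulary (assumed)
D_MODEL = 64       # embedding width (assumed)
CTX = 128          # context length (assumed)
N_LAYERS = 1       # a single decoder block (assumed)
D_FF = 4 * D_MODEL # feed-forward width

token_emb = VOCAB * D_MODEL                # tied with the output head
pos_emb = CTX * D_MODEL                    # learned positional embeddings
attn = 4 * (D_MODEL * D_MODEL + D_MODEL)   # Wq, Wk, Wv, Wo with biases
ffn = (D_MODEL * D_FF + D_FF) + (D_FF * D_MODEL + D_MODEL)
norms = (2 * N_LAYERS + 1) * 2 * D_MODEL   # two LayerNorms per block + final

total = token_emb + pos_emb + N_LAYERS * (attn + ffn) + norms
print(total)  # 91072 -- roughly 90k at these sizes
```

At these (assumed) sizes the embedding table alone is over a third of the budget, which is part of why a tiny vocabulary is almost forced at this scale.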

This model isn’t useful in the traditional sense. It won’t generate perfect code or handle complex scenarios. But it’s meant to show what happens when you take a Transformer and reduce it to its absolute minimum - what survives, and what breaks.

Capabilities:

  • Complete simple Python imports

  • Autocomplete basic function headers

  • Mimic code structure

Limitations:

  • Doesn’t handle long or complex code

  • Frequently generates nonsensical or malformed tokens

  • Vocabulary is too small to cover a realistic token set
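On the vocabulary limitation: one common way tiny models keep the embedding table small is character-level tokenization. This is just a sketch of that idea, not a claim about how Code-Mini-v0.1 actually tokenizes:

```python
# Character-level tokenizer sketch: a minimal vocabulary at the cost of
# longer sequences. Hypothetical -- not the actual Code-Mini-v0.1 scheme.

def build_vocab(corpus: str) -> dict:
    """Map each distinct character in the corpus to an integer id."""
    return {ch: i for i, ch in enumerate(sorted(set(corpus)))}

def encode(text: str, vocab: dict) -> list:
    return [vocab[ch] for ch in text]

def decode(ids: list, vocab: dict) -> str:
    inv = {i: ch for ch, i in vocab.items()}
    return "".join(inv[i] for i in ids)

corpus = "import numpy as np\n"
vocab = build_vocab(corpus)          # only 13 distinct characters here
ids = encode("import", vocab)
assert decode(ids, vocab) == "import"
```

The trade-off is exactly the one the limitations list hints at: a tiny vocabulary survives, but anything outside the characters (or tokens) seen in training turns into malformed output.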

It’s not about performance—it’s about observation. What can we learn from pushing a model to the smallest parameter set possible? How does it break, and why?

Link to the model: Code-Mini-v0.1 on Hugging Face

Would love to hear thoughts from the community, especially anyone experimenting with small-scale transformers or minimalistic models.

Welcome to the forum, @Harley-ml!

Good on you for pushing the boundaries and documenting what works and what doesn’t with minimal transformers. Just a heads up: the field is moving fast, and there are more efficient approaches out there for small models. A lot of people are still stuck burning through dozens of epochs or using huge datasets for tiny returns.

If you keep experimenting, try rethinking your training pipeline and data processing. Sometimes it’s not about the number of epochs or model size, but how you process and structure the data.
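To make that concrete with one example (the reply doesn't name a specific technique, so this is just an illustration): exact deduplication of training snippets is a cheap preprocessing step that often matters more than extra epochs.

```python
# Sketch: exact deduplication of training snippets via content hashing.
# A hypothetical preprocessing step, not a method anyone in this thread
# confirmed using.

import hashlib

def dedupe(snippets):
    """Keep only the first occurrence of each distinct snippet."""
    seen = set()
    unique = []
    for s in snippets:
        h = hashlib.sha256(s.encode("utf-8")).hexdigest()
        if h not in seen:
            seen.add(h)
            unique.append(s)
    return unique

corpus = ["import os", "def f():\n    pass", "import os"]
print(len(dedupe(corpus)))  # 2 -- the duplicate import is dropped
```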

Keep going. The breakthroughs always come from people willing to break the mold.