> My 8B 4D LLM Claims to have more parameters than a 100B model check it out
>
> Using 4D Physics I created a 4D LLM using Python3 than allows a 8B Model to have more than 100B parameters. Check it out. Open source guys, love you all! I’m so excited about this project!
>
> https://github.com/RomanAILabs-Auth/luna/

Here are the two major red flags in that post:

  1. “4D Physics” / “4D LLM”: These are not standard or defined terms in AI or computer science. They read as “technobabble”: words strung together to sound impressive and scientific, but with no established technical meaning in this context.

  2. “allows a 8B Model to have more than 100B parameters”: This is a direct contradiction. A model’s size is its parameter count: an 8-billion-parameter model (8B) is, by definition, a model with 8 billion parameters. It cannot simultaneously be an 8B model and have more than 100B parameters.
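The second point can be made concrete. A model's parameter count is nothing more than the sum of the element counts of its weight tensors, so it is a fixed property of the architecture; no "4D" trick can change it. A minimal sketch (the layer shapes below are illustrative toy values, not taken from the linked repository):

```python
# A model's parameter count is just the sum of the sizes of its weight
# tensors. The shapes here are hypothetical, for illustration only.
def count_parameters(weight_shapes):
    """Return the total number of scalar parameters across all tensors."""
    total = 0
    for shape in weight_shapes:
        n = 1
        for dim in shape:
            n *= dim  # element count of one tensor = product of its dims
        total += n
    return total

# A toy "model": one embedding matrix and one square projection.
toy_shapes = [(50_000, 4_096), (4_096, 4_096)]
print(count_parameters(toy_shapes))  # prints 221577216
```

However the weights are stored, indexed, or projected at inference time, this count does not change; calling an 8B model "more than 100B parameters" would require it to literally contain 100+ billion stored weights.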

This combination of impressive-sounding but undefined jargon and a fundamental logical contradiction is a classic hallmark of an LLM hallucination.