Small decoder-only models (< 1B parameters)

Are there any decoder-only models (Llama, Mistral, Gemma, or otherwise) that have < 1B parameters?

Any recommendations, especially ones that are good at multilingual tasks? A rough sketch of how I'd plan to use one is below.
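
For context, here's roughly how I'd load and try out whatever gets suggested, via Hugging Face transformers. The model name in the snippet is just a placeholder, not a real checkpoint; I'd swap in the recommended one.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "org/small-decoder-0.5b"  # placeholder name, not a real model

# Load tokenizer and decoder-only (causal LM) model
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Quick multilingual sanity check on generation quality
prompt = "Translate to French: The weather is nice today."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```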