
In a significant stride towards democratizing artificial intelligence, Hugging Face has introduced SmolLM2, a new family of compact language models. Designed to deliver strong performance while requiring substantially less computational power, the models come in three sizes: 135 million, 360 million, and 1.7 billion parameters. Their compact footprint allows them to run on devices with limited processing power and memory.
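As a rough illustration of why these parameter counts matter for on-device use, the sketch below estimates the memory needed just to hold each model's weights at two common precisions. The parameter counts come from the sizes above; the per-weight byte costs (2 bytes for fp16, 1 byte for int8) are standard, and the figures are back-of-the-envelope estimates only, since real inference also needs room for activations, the KV cache, and runtime overhead.

```python
# Approximate weight-only memory footprint of the SmolLM2 family.
# fp16 stores each parameter in 2 bytes; int8 quantization uses 1 byte.

PARAM_COUNTS = {
    "SmolLM2-135M": 135_000_000,
    "SmolLM2-360M": 360_000_000,
    "SmolLM2-1.7B": 1_700_000_000,
}

def weight_memory_mb(params: int, bytes_per_param: int) -> float:
    """Size of the weights alone, in megabytes (1 MB = 1e6 bytes)."""
    return params * bytes_per_param / 1e6

for name, n in PARAM_COUNTS.items():
    fp16 = weight_memory_mb(n, 2)
    int8 = weight_memory_mb(n, 1)
    print(f"{name}: ~{fp16:,.0f} MB (fp16), ~{int8:,.0f} MB (int8)")
# The 135M model fits in roughly 270 MB at fp16, while even the 1.7B
# model stays around 3.4 GB, within reach of many phones and laptops.
```

By comparison, multi-billion-parameter models typically demand tens of gigabytes, which is what keeps them confined to server-grade hardware.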