Switzerland launched Apertus, a national LLM developed by EPFL, ETH Zurich, and the Swiss National Supercomputing Centre as an alternative to ChatGPT, Meta’s Llama models, and DeepSeek. The model comes in two sizes (8 billion and 70 billion parameters) and was trained on 15 trillion tokens spanning more than 1,000 languages; 40 percent of the training data is non-English, including underrepresented languages like Swiss German and Romansh. Unlike commercial models, Apertus (Latin for “open”) has its architecture, model weights, training data, and development recipes openly accessible and fully documented, ensuring compliance with Swiss data protection laws and the EU AI Act’s transparency requirements. The models are freely available under a permissive open source license for educational, research, and commercial applications, with deployment supported through platforms like Transformers, vLLM, SGLang, and MLX. (Swiss AI)
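
Since the models are distributed through standard open-weight tooling, loading them should follow the usual Hugging Face Transformers pattern. The sketch below is a minimal example; the repository ID `swiss-ai/Apertus-8B` is an assumption, so check the Swiss AI organization’s model pages for the exact identifiers.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo ID; verify the actual Apertus identifier on Hugging Face.
model_id = "swiss-ai/Apertus-8B"

# Load tokenizer and weights; device_map="auto" places layers on available GPUs.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Run a short generation to confirm the model loads and produces text.
inputs = tokenizer("Grüezi! Apertus is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same weights can also be served at scale with vLLM or SGLang, or run locally on Apple silicon via MLX, per the deployment options listed above.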