CAI contributes to Swiss Open LLM release

ZHAW's Centre for Artificial Intelligence (CAI) has been involved in the development of Switzerland's first large language model, a 70-billion-parameter system, as part of the SwissAI Initiative. The model is scheduled for release in late summer 2025.

The SwissAI Initiative, led by ETH Zurich and EPFL with over 800 researchers from more than 10 Swiss institutions, represents Europe's largest open science effort for AI foundation models. CAI has been part of the initiative since its inception in August 2023. It is involved in several verticals, for example on health and on vision for robotics, and contributes actively to the horizontal on data for LLMs.

For example, Dr. Jan Deriu from CAI’s NLP group supported the collection of high-quality, Swiss-centric training data, ensuring compliance with Swiss privacy laws, copyright regulations, and website crawling guidelines.

The model will be released fully open source, with complete transparency regarding training data, code, and weights, under the Apache 2.0 license. The multilingual model supports over 1,000 languages and will be available in 8B and 70B parameter versions. Training was conducted on the "Alps" supercomputer, which runs on 100% carbon-neutral electricity. The initiative aims to reduce European dependence on closed commercial AI systems from the US and China. See the SwissAI Initiative's full press release here: https://ethz.ch/en/news-and-events/eth-news/news/2025/07/a-language-model-built-for-the-public-good.html.
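
For illustration only, the following minimal sketch shows how an openly released checkpoint of this kind could typically be loaded and queried with the Hugging Face transformers library. The model identifier used here is a placeholder, since the official repository name had not been announced at the time of writing.

```python
# Minimal sketch: loading an openly released checkpoint with Hugging Face transformers.
# "swiss-ai/open-llm-8b" is a placeholder identifier, not the official model name.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "swiss-ai/open-llm-8b"  # replace with the official model ID once released

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the precision stored in the released weights
    device_map="auto",    # place layers on available GPU(s) or CPU
)

prompt = "What is the capital of Switzerland?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```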

The SwissAI Initiative goes beyond building LLMs, aiming more broadly at AI for good. It also supported a recently published ICML paper co-authored by researchers from ETH Zurich and CAI, which explores the internal workings of Transformer models and identifies symmetries that could ultimately lead to more energy-efficient training of large models. Read the full paper here: https://arxiv.org/pdf/2502.10927.

Jan Deriu