European Open Source AI Index

Viking

by Silo AI, TurkuNLP, and High Performance Language Technologies (HPLT)

Multilingual model trained on Nordic languages, English, and code. Also available as 7B and 13B models.
Text
Full
https://huggingface.co/LumiOpen/Viking-33B
Viking-33B
Apache-2.0
Silo AI was acquired by AMD in August 2024
https://www.silo.ai
https://turkunlp.org
https://hplt-project.org
May 2024
Availability
Base Model Data
Trained on SlimPajama, StarCoder, and mC4; exact details have yet to be published.
End User Model Data
Same as base model. No additional fine-tuning data specified.
Base Model Weights
Model weights available at various training checkpoints.
https://huggingface.co/LumiOpen/Viking-7B
End User Model Weights
https://huggingface.co/LumiOpen/Viking-7B
Training Code
No details of the training code have been released.
Documentation
Code Documentation
Hardware Architecture
Uses a LLaMA-like decoder-only transformer architecture; no further details have been provided.
Preprint
No preprint found; some information is published in a blog post.
https://www.amd.com/en/blogs/2024/viking-7b-13b-33b-sailing-the-nordic-seas-of-multilinguality.html
Paper
No peer-reviewed paper found.
Modelcard
Model card provides a broad overview; more detailed documentation is forthcoming.
https://huggingface.co/LumiOpen/Viking-33B
Datasheet
Per the Hugging Face page: "Viking is being trained on a 2 trillion token mixed dataset of English, Finnish, Swedish, Danish, Norwegian, Icelandic and code. Full details will be published soon."
Access
Licenses
Apache 2.0; it is unclear whether both the weights and the code are covered by it.
Last updated 13 March 2026
Supported by the Centre for Language Studies and the Dutch Research Council. Website design & development © 2024 by BSTN. This version of the index generated 09 April 2026, website content last updated 11 March 2026.