European Open Source AI Index

Parameter descriptions:

Base Model Data
Are data sources for training the base model comprehensively documented and made available? Where a distinction between the base (foundation) and end (user) model is not applicable, this mirrors the end user model data entry.
End User Model Data
Are data sources for training the model that the end user interacts with comprehensively documented and made available?
Base Model Weights
Are the weights of the base model made freely available? Where a distinction between the base (foundation) and end (user) model is not applicable, this mirrors the end user model weights entry.
End User Model Weights
Are the weights of the model that the end user interacts with made freely available?
Training Code
Is the source code for data source processing, model training, and tuning comprehensively made available?
Code Documentation
Is the source code for data source processing, model training, and tuning comprehensively documented?
Hardware Architecture
Is the hardware architecture used for data source processing and model training comprehensively documented?
Preprint
Are archived preprint(s) available that detail all major parts of the system, including data source processing, model training, and tuning steps?
Paper
Are peer-reviewed scientific publications available that detail all major parts of the system, including data source processing, model training, and tuning steps?
Modelcard
Is a model card available in a standardized format that provides comprehensive insight into model architecture, training, fine-tuning, and evaluation?
Datasheet
Is a datasheet as defined in "Datasheets for Datasets" (Gebru et al. 2021) available?
Package
Is a packaged release of the model available on a software repository (e.g. the Python Package Index, Homebrew)?
API and Meta Prompts
Is an API available that provides unrestricted access to the model (other than security and CDN restrictions)? If applicable, this entry also collects information on the use and availability of meta prompts.
Licenses
Is the project fully covered by Open Source Initiative (OSI)-approved licenses, including all data sources and training pipeline code?

OLMo

by Ai2

Open LLM trained from scratch by Allen AI.
Modality: Text
Openness: Full
Model page: https://huggingface.co/allenai/Olmo-3-32B-Think
Base model: Olmo-3-1125-32B
End user model: Olmo-3-32B-Think
License: Apache-2.0
Organization: Allen Institute for AI (non-profit research institute)
Website: https://allenai.org
Release date: November 2025
Availability
Base Model Data
Training data for base model released and use documented.
https://huggingface.co/datasets/allenai/dolma3_mix-5.5T-1125
https://huggingface.co/datasets/allenai/dolma3_longmino_mix-100B-1125
https://huggingface.co/datasets/allenai/dolma3_dolmino_mix-100B-1125
End User Model Data
Data for fine-tuning published in a well-organized manner.
https://huggingface.co/datasets/allenai/Dolci-Think-RL-32B
https://huggingface.co/datasets/allenai/Dolci-Instruct-RL-7B
https://huggingface.co/datasets/allenai/Dolci-Think-DPO-32B
https://huggingface.co/datasets/allenai/Dolci-Instruct-DPO-7B
https://huggingface.co/datasets/allenai/Dolci-Think-SFT-32B
https://huggingface.co/datasets/allenai/Dolci-Instruct-SFT-7B
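To inspect any of these releases locally, a minimal sketch using the Hugging Face datasets library follows; the split name and streaming mode are assumptions, so check each dataset card for the actual configuration. The same pattern applies to the Dolma 3 pretraining mixes listed under Base Model Data.

```python
# Minimal sketch (not from the index): stream one of the released
# post-training datasets with the Hugging Face `datasets` library.
# The split name "train" is an assumption; consult the dataset card.
from datasets import load_dataset

ds = load_dataset("allenai/Dolci-Think-SFT-32B", split="train", streaming=True)
for example in ds:
    print(example)  # field names depend on the dataset schema
    break
```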
Base Model Weights
Model weights made available on Hugging Face.
https://huggingface.co/allenai/Olmo-3-1125-32B
End User Model Weights
Model weights made available on Hugging Face.
https://huggingface.co/allenai/Olmo-3-32B-Think
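As a quick check that the weights load, here is a minimal sketch using transformers; it assumes a release with Olmo 3 support, the accelerate package for device_map="auto", and enough memory for a 32B-parameter checkpoint.

```python
# Minimal sketch (assumptions noted above): load the end-user checkpoint
# and generate a short completion.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/Olmo-3-32B-Think"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # halves memory vs. float32
    device_map="auto",           # requires the `accelerate` package
)

inputs = tokenizer("Open model weights mean that", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```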
Training Code
Multiple repos with training, architecture and fine-tuning code available.
https://github.com/allenai/open-instruct
https://github.com/allenai/OLMo-core
Documentation
Code Documentation
Repositories and code well-described, commented and documented.
https://github.com/allenai/open-instruct
https://github.com/allenai/OLMo-core
Hardware Architecture
Architecture documented in requisite detail.
https://aclanthology.org/2024.acl-long.841/
Preprint
Pre-print goes into impressive detail about the data, training process, architecture, and evaluation.
https://www.datocms-assets.com/64837/1765558567-olmo_3_technical_report-4.pdf
Paper
Conference paper published at COLM 2025; further publications may be under review.
https://openreview.net/forum?id=2ezugTT9kU#discussion
Modelcard
Model card provides broad overview and links to full details.
https://huggingface.co/allenai/Olmo-3-32B-Think
Datasheet
Data sheets are well-documented and provide requisite info.
https://huggingface.co/datasets/allenai/dolma3_mix-5.5T-1125
https://huggingface.co/datasets/allenai/dolma3_longmino_mix-100B-1125
https://huggingface.co/datasets/allenai/dolma3_dolmino_mix-100B-1125
https://huggingface.co/datasets/allenai/Dolci-Think-RL-32B
https://huggingface.co/datasets/allenai/Dolci-Instruct-RL-7B
https://huggingface.co/datasets/allenai/Dolci-Think-DPO-32B
https://huggingface.co/datasets/allenai/Dolci-Instruct-DPO-7B
https://huggingface.co/datasets/allenai/Dolci-Think-SFT-32B
https://huggingface.co/datasets/allenai/Dolci-Instruct-SFT-7B
Access
Package
Model available on Ollama.
https://ollama.com/library/olmo2
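A minimal sketch of querying the packaged model through the Ollama Python client follows; it assumes a local Ollama server is running and the model has been pulled. Note that the index currently links the olmo2 library entry, which is the tag used here.

```python
# Minimal sketch: chat with the Ollama-packaged model via the `ollama`
# Python client. Assumes `ollama serve` is running and the model was
# fetched beforehand, e.g. with `ollama pull olmo2`.
import ollama

response = ollama.chat(
    model="olmo2",  # tag from the linked library entry
    messages=[{"role": "user", "content": "In one sentence, what is OLMo?"}],
)
print(response["message"]["content"])
```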
API and Meta Prompts
Available through Hugging Face, where inference endpoints are offered.
https://huggingface.co/allenai/Olmo-3-32B-Think
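A minimal sketch of calling the model through the Hugging Face Inference API via huggingface_hub is shown below; whether a serverless endpoint is live for this checkpoint at any given time is an assumption, and a dedicated Inference Endpoint URL can be passed to the client instead.

```python
# Minimal sketch: query the hosted model with huggingface_hub's
# InferenceClient. An HF_TOKEN environment variable (or `token=` argument)
# may be required depending on the endpoint.
from huggingface_hub import InferenceClient

client = InferenceClient(model="allenai/Olmo-3-32B-Think")
completion = client.chat_completion(
    messages=[{"role": "user", "content": "What does 'open weights' mean?"}],
    max_tokens=64,
)
print(completion.choices[0].message.content)
```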
Licenses
Apache 2.0, an OSI-approved license.
https://huggingface.co/allenai/Olmo-3-32B-Think#model-description
Supported by the Centre for Language Studies and the Dutch Research Council. Website design & development © 2024 by BSTN. This version of the index generated 26 December 2025, website content last updated 30 December 2025.