European Open Source AI Index

AMD OLMo

by AMD

Version of OLMo trained from scratch on AMD GPUs.
Code
Limited
https://huggingface.co/amd/AMD-OLMo-1B-SFT-DPO
AMD-OLMo-1B
AMD-OLMo-1B-SFT-DPO
Apache-2.0
AMD, a major chip manufacturer.
https://www.amd.com/en.html
October 2024
Availability
Base Model Data
Data sources used to train the base model are shared; the base model is trained on the Dolma dataset.
https://huggingface.co/datasets/allenai/dolma
End User Model Data
End user model fine-tuned (SFT and DPO) on a combination of instruction-tuning and preference datasets.
https://huggingface.co/datasets/allenai/tulu-v2-sft-mixture
https://huggingface.co/datasets/teknium/OpenHermes-2.5
https://huggingface.co/datasets/TIGER-Lab/WebInstructSub
https://huggingface.co/datasets/m-a-p/Code-Feedback
https://huggingface.co/datasets/argilla/ultrafeedback-binarized-preferences-cleaned
Base Model Weights
Weights published on HuggingFace.
https://huggingface.co/amd/AMD-OLMo-1B
End User Model Weights
Weights published on HuggingFace.
https://huggingface.co/amd/AMD-OLMo-1B-SFT-DPO
Training Code
Training code shared on GitHub.
https://github.com/allenai/OLMo
Documentation
Code Documentation
Training code is well documented, and the model card provides instructions for setting up training.
https://github.com/allenai/OLMo
https://huggingface.co/amd/AMD-OLMo
Hardware Architecture
Hardware architecture shared on model card.
https://huggingface.co/amd/AMD-OLMo-1B-SFT-DPO
Preprint
No preprint available; AMD provides a technical article introducing the model instead.
https://www.amd.com/en/developer/resources/technical-articles/introducing-the-first-amd-1b-language-model.html
Paper
No peer-reviewed paper found.
Modelcard
Model card provides detail on model architecture, training, fine-tuning, and evaluation.
https://huggingface.co/amd/AMD-OLMo-1B-SFT-DPO
Datasheet
All data sources are well-documented.
https://huggingface.co/datasets/allenai/dolma
https://huggingface.co/datasets/allenai/tulu-v2-sft-mixture
https://huggingface.co/datasets/teknium/OpenHermes-2.5
https://huggingface.co/datasets/TIGER-Lab/WebInstructSub
https://huggingface.co/datasets/m-a-p/Code-Feedback
https://huggingface.co/datasets/argilla/ultrafeedback-binarized-preferences-cleaned
Access
Licenses
Licensed under Apache-2.0, an OSI-approved license.
https://huggingface.co/amd/AMD-OLMo-1B-SFT-DPO#license
Supported by the Centre for Language Studies and the Dutch Research Council. Website design & development © 2024 by BSTN. This version of the index generated 09 April 2026, website content last updated 11 March 2026.