European Open Source AI Index
AMD OLMo
by AMD
About the model:
A version of AI2's OLMo, trained from scratch on AMD Instinct GPUs.
Model type:
Code
Model performance class:
Limited
Link to the model:
https://huggingface.co/amd/AMD-OLMo-1B-SFT-DPO
Base models:
AMD-OLMo-1B
End model:
AMD-OLMo-1B-SFT-DPO
End model license:
Apache-2.0
About the organisation:
AMD (Advanced Micro Devices), a major semiconductor manufacturer.
Link to the organisation:
https://www.amd.com/en.html
Model release date:
October 2024
Availability
Base Model Data
Data sources used to train the base model are shared; the base model was pretrained on the Dolma dataset.
https://huggingface.co/datasets/allenai/dolma
End User Model Data
The end model was fine-tuned (SFT and DPO) on a combination of instruction-tuning and preference datasets.
https://huggingface.co/datasets/allenai/tulu-v2-sft-mixture
https://huggingface.co/datasets/teknium/OpenHermes-2.5
https://huggingface.co/datasets/TIGER-Lab/WebInstructSub
https://huggingface.co/datasets/m-a-p/Code-Feedback
https://huggingface.co/datasets/argilla/ultrafeedback-binarized-preferences-cleaned
Base Model Weights
Weights published on Hugging Face.
https://huggingface.co/amd/AMD-OLMo-1B
End User Model Weights
Weights published on Hugging Face.
https://huggingface.co/amd/AMD-OLMo-1B-SFT-DPO
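The published end-model weights can be pulled directly from the Hugging Face Hub. A minimal sketch using the `transformers` library (assumes `transformers` and `torch` are installed; only the repository id comes from this page, the prompt and parameter values are illustrative):

```python
MODEL_ID = "amd/AMD-OLMo-1B-SFT-DPO"  # repository listed on this index page


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Download the weights on first use and return a text completion."""
    # Imported lazily so the module loads even without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("What is an open-weights language model?"))
```

The first call downloads roughly 1B parameters of weights, so a GPU is optional but helpful; the base model can be loaded the same way by swapping in `amd/AMD-OLMo-1B`.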
Training Code
Training code shared on GitHub (AI2's OLMo repository).
https://github.com/allenai/OLMo
Documentation
Code Documentation
The training code is well documented, and the model card provides instructions for setting up training.
https://github.com/allenai/OLMo
https://huggingface.co/amd/AMD-OLMo
Hardware Architecture
The hardware architecture is described in the model card.
https://huggingface.co/amd/AMD-OLMo-1B-SFT-DPO
Preprint
No preprint available; AMD has published a technical blog post introducing the model.
https://www.amd.com/en/developer/resources/technical-articles/introducing-the-first-amd-1b-language-model.html
Paper
No peer-reviewed paper found.
Modelcard
Model card provides detail on model architecture, training, fine-tuning, and evaluation.
https://huggingface.co/amd/AMD-OLMo-1B-SFT-DPO
Datasheet
All data sources are well-documented.
https://huggingface.co/datasets/allenai/dolma
https://huggingface.co/datasets/allenai/tulu-v2-sft-mixture
https://huggingface.co/datasets/teknium/OpenHermes-2.5
https://huggingface.co/datasets/TIGER-Lab/WebInstructSub
https://huggingface.co/datasets/m-a-p/Code-Feedback
https://huggingface.co/datasets/argilla/ultrafeedback-binarized-preferences-cleaned
Access
Licenses
Licensed under Apache-2.0, an OSI-approved license.
https://huggingface.co/amd/AMD-OLMo-1B-SFT-DPO#license
Supported by the Centre for Language Studies and the Dutch Research Council. Website design & development © 2024 by BSTN. This version of the index generated 09 April 2026; website content last updated 11 March 2026.