European Open Source AI Index
OpenMoE
by Zheng Zian
About the model:
An early model aiming to ignite the open-source MoE community.
Model type:
Text
Model performance class:
Limited
Link to the model:
https://huggingface.co/OrionZheng/openmoe-8b-chat
Base models:
OpenMoE-8B
End model:
OpenMoE-8B-Chat
End model license:
Apache-2.0
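As a quick illustration, the chat model can presumably be loaded through the standard `transformers` AutoModel API. This is a minimal sketch, assuming the repository follows the usual conventions and requires `trust_remote_code=True` for its custom MoE layers; it has not been verified against the model card.

```python
# Minimal sketch of loading OpenMoE-8B-Chat from the Hugging Face Hub.
# Assumption: the repo works with the standard AutoModel conventions and
# needs trust_remote_code=True for its custom mixture-of-experts layers.
MODEL_ID = "OrionZheng/openmoe-8b-chat"


def main() -> None:
    # Imports are deferred: the multi-gigabyte download only happens when
    # this is actually run, and the module stays importable without
    # transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, trust_remote_code=True, device_map="auto"
    )

    inputs = tokenizer("Hello, who are you?", return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))


if __name__ == "__main__":
    main()
```

Note that an 8B-parameter model needs substantial GPU memory; `device_map="auto"` lets `accelerate` shard it across available devices.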
About the organisation:
Member of a student team at the National University of Singapore.
Link to the organisation:
https://huggingface.co/OrionZheng
Model release date:
July 2023
Availability
Base Model Data
https://huggingface.co/OrionZheng/openmoe-8b-chat
End User Model Data
RedPajama and The Stack are mentioned.
https://huggingface.co/OrionZheng/openmoe-8b-chat
Base Model Weights
https://huggingface.co/hpcai-tech/openmoe-8B
End User Model Weights
https://huggingface.co/OrionZheng/openmoe-8b-chat
Training Code
Some code is available on Hugging Face.
Documentation
Code Documentation
Code on Hugging Face is documented.
Hardware Architecture
Preprint
Preprint published on arXiv.
https://arxiv.org/pdf/2402.01739
Paper
Paper published in ICML.
https://dl.acm.org/doi/10.5555/3692070.3694363
Modelcard
https://huggingface.co/OrionZheng/openmoe-8b-chat
Datasheet
Access
Package
API and Meta Prompts
Licenses
Supported by the Centre for Language Studies and the Dutch Research Council. Website design & development © 2024 by BSTN. This version of the index generated 10 March 2026; website content last updated 11 March 2026.