European Open Source AI Index

OpenMoE

by Zheng Zian

An early model aiming to ignite the open-source MoE community.
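For context, a mixture-of-experts (MoE) model like OpenMoE routes each token through only a small subset of expert sub-networks, selected by a learned gate. A minimal sketch of top-2 gating follows; the expert functions and gate values are illustrative toys, not taken from the OpenMoE codebase.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(token, experts, gate_logits, top_k=2):
    """Route one token to the top_k experts by gate score and combine
    their outputs, weighted by the renormalized gate probabilities."""
    probs = softmax(gate_logits)
    top = sorted(range(len(experts)), key=lambda i: probs[i], reverse=True)[:top_k]
    norm = sum(probs[i] for i in top)
    return sum(probs[i] / norm * experts[i](token) for i in top)

# Toy example: four "experts" that simply scale their input.
experts = [lambda x, s=s: s * x for s in (1.0, 2.0, 3.0, 4.0)]
gate_logits = [0.1, 2.0, 0.2, 1.5]  # gate favors experts 1 and 3
out = moe_forward(10.0, experts, gate_logits)
```

Because only `top_k` experts run per token, compute cost grows with `top_k` rather than with the total number of experts, which is the efficiency argument behind MoE architectures.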
Text
Limited
https://huggingface.co/OrionZheng/openmoe-8b-chat
OpenMoE-8B
OpenMoE-8B-Chat
Apache-2.0
Member of a student team at the National University of Singapore.
https://huggingface.co/OrionZheng
July 2023
Availability
Base Model Data
https://huggingface.co/OrionZheng/openmoe-8b-chat
End User Model Data
RedPajama and The Stack mentioned
https://huggingface.co/OrionZheng/openmoe-8b-chat
Base Model Weights
https://huggingface.co/hpcai-tech/openmoe-8B
End User Model Weights
https://huggingface.co/OrionZheng/openmoe-8b-chat
Training Code
Some code on Hugging Face
Documentation
Code Documentation
Code on Hugging Face is documented
Hardware Architecture
Preprint
Preprint published on arXiv.
https://arxiv.org/pdf/2402.01739
Paper
Paper published in ICML.
https://dl.acm.org/doi/10.5555/3692070.3694363
Modelcard
https://huggingface.co/OrionZheng/openmoe-8b-chat
Datasheet
Access
Package
API and Meta Prompts
Licenses
Supported by the Centre for Language Studies and the Dutch Research Council. Website design & development © 2024 by BSTN. This version of the index generated 10 March 2026, website content last updated 11 March 2026.