European Open Source AI Index
Zephyr
by HuggingFace
About the model:
Open mixture-of-experts (MoE) model, fine-tuned with ORPO from Mistral AI's Mixtral-8x22B.
Model type:
Text
Model performance class:
Full
Link to the model:
https://huggingface.co/HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1
Base models:
Mixtral-8x22B-v0.1
End model:
Zephyr-ORPO-141B-A35B-v0.1
End model license:
Apache-2.0
About the organisation:
HuggingFace, a large AI platform.
Link to the organisation:
https://www.huggingface.co/
Model release date:
April 2024
Availability
Base Model Data
Mistral AI has not disclosed any information about the base model's training data.
https://huggingface.co/mistralai/Mistral-7B-v0.1/discussions/8
End User Model Data
UltraChat and openbmb/UltraFeedback
https://huggingface.co/HuggingFaceH4/zephyr-7b-beta
Base Model Weights
Weights available through HuggingFace.
https://huggingface.co/mistralai/Mixtral-8x22B-v0.1
End User Model Weights
Weights made available on HuggingFace.
https://huggingface.co/HuggingFaceH4/zephyr-7b-beta
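Since the end-model weights are openly distributed on Hugging Face, they can be loaded with standard tooling. The sketch below, a minimal illustration assuming the `transformers` library is installed and that sufficient GPU memory is available (the model has ~141B parameters, ~35B active), shows the usual chat-style invocation; the helper `build_chat` and the prompt text are illustrative, not part of the official model card.

```python
def build_chat(system: str, user: str) -> list[dict]:
    """Assemble messages in the standard chat format expected by chat models."""
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]

def generate(prompt: str, max_new_tokens: int = 128):
    """Sketch of running the model via the transformers text-generation pipeline."""
    # Import lazily: this call downloads and shards the full weights,
    # so it is only sensible on a multi-GPU machine.
    from transformers import pipeline

    pipe = pipeline(
        "text-generation",
        model="HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1",
        device_map="auto",  # let accelerate place shards across devices
    )
    messages = build_chat("You are a helpful assistant.", prompt)
    return pipe(messages, max_new_tokens=max_new_tokens)
```

The message-list format above is what the pipeline's chat templating consumes; the actual generation call is only worth running on hardware that can hold the sharded weights.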
Training Code
Documentation
Code Documentation
https://huggingface.co/HuggingFaceH4/zephyr-7b-beta
Hardware Architecture
Preprint
https://arxiv.org/pdf/2310.16944
Paper
Paper published in COLM 2024.
https://openreview.net/forum?id=aKkAwZB6JV
Modelcard
https://huggingface.co/HuggingFaceH4/zephyr-7b-beta
Datasheet
No datasheet found.
Access
Licenses
Weights under MIT; datasets under mixed licenses.
Supported by the Centre for Language Studies and the Dutch Research Council. Website design & development © 2024 by BSTN. This version of the index generated 09 April 2026; website content last updated 11 March 2026.