
Parameter descriptions:

Base Model Data
Are datasources for training the base model comprehensively documented and made available? If a distinction between base (foundation) and end (user) model is not applicable, this mirrors the end model data entries.
End User Model Data
Are datasources for training the model that the end user interacts with comprehensively documented and made available?
Base Model Weights
Are the weights of the base models made freely available? If a distinction between base (foundation) and end (user) model is not applicable, this mirrors the end model weights entries.
End User Model Weights
Are the weights of the model that the end user interacts with made freely available?
Training Code
Is the source code of datasource processing, model training and tuning comprehensively made available?
Code Documentation
Is the source code of datasource processing, model training and tuning comprehensively documented?
Hardware Architecture
Is the hardware architecture used for datasource processing and model training comprehensively documented?
Preprint
Are archived preprint(s) available that detail all major parts of the system including datasource processing, model training and tuning steps?
Paper
Are peer-reviewed scientific publications available that detail all major parts of the system including datasource processing, model training and tuning steps?
Modelcard
Is a model card available in standardized format that provides comprehensive insight on model architecture, training, fine-tuning, and evaluation?
Datasheet
Is a datasheet as defined in "Datasheets for Datasets" (Gebru et al. 2021) available?
Package
Is a packaged release of the model available on a software repository (e.g. the Python Package Index or Homebrew)?
API and Meta Prompts
Is an API available that provides unrestricted access to the model (other than security and CDN restrictions)? If applicable, this entry also collects information on the use and availability of meta prompts.
Licenses
Is the project fully covered by Open Source Initiative (OSI)-approved licenses, including all data sources and training pipeline code?
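
Several of these parameters can be checked programmatically. As an illustration only (not the index's own methodology), the following Python sketch queries the HuggingFace Hub for the presence of weight files and the license declared in a repository's model-card metadata, using the Nanbeige model assessed below as the example repo:

# Illustrative sketch: checking weight availability and declared license
# via the HuggingFace Hub API. Requires the huggingface_hub package.
from huggingface_hub import model_info

info = model_info("Nanbeige/Nanbeige4-3B-Thinking-2511")

# Weights availability: does the repo contain weight files?
files = [s.rfilename for s in info.siblings or []]
has_weights = any(f.endswith((".safetensors", ".bin")) for f in files)

# License declared in the model-card front matter, if any.
license_tag = info.card_data.license if info.card_data else None

print(f"weights available: {has_weights}, declared license: {license_tag}")

Such a check only covers machine-readable metadata; most parameters above (documentation quality, peer review, datasheets) still require manual assessment.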

Nanbeige

by Nanbeige LLM lab

Chinese text-only LLM. Claims strong performance on writing tasks.
Type: Text
Openness: Limited
Link: https://huggingface.co/Nanbeige/Nanbeige4-3B-Thinking-2511
Base model: Nanbeige4-3B-Base
End model: Nanbeige4-3B-Thinking-2511
License: Apache-2.0
Organization type: LLM lab
Organization: https://huggingface.co/Nanbeige
Released: November 2025
Availability
Base Model Data
No information regarding model data.
End User Model Data
No information regarding model data.
Base Model Weights
Model weights made available on HuggingFace.
https://huggingface.co/Nanbeige/Nanbeige4-3B-Base
End User Model Weights
Model weights made available on HuggingFace (see the loading sketch at the end of this section).
https://huggingface.co/Nanbeige/Nanbeige4-3B-Thinking-2511
Training Code
Training code deleted.
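
Since the end-user weights are public, they can be loaded with the transformers library. A minimal sketch: trust_remote_code, the chat-template call, and the generation settings are assumptions about the repo layout, not taken from the model card:

# Minimal sketch: loading the published Nanbeige4-3B-Thinking-2511 weights.
# Assumes a standard causal-LM repo layout with a chat template;
# trust_remote_code and the generation settings are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "Nanbeige/Nanbeige4-3B-Thinking-2511"
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype="auto",
    device_map="auto",  # requires the accelerate package
    trust_remote_code=True,
)

# "Write a short poem about autumn" (the model is Chinese text-only).
messages = [{"role": "user", "content": "写一首关于秋天的短诗"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))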
Documentation
Code Documentation
No documentation of the codebase.
Hardware Architecture
Hardware architecture not clearly specified.
Preprint
Technical report published on HuggingFace.
https://huggingface.co/Nanbeige/Nanbeige4-3B-Thinking-2511/blob/main/Nanbeige4-3B-Technical-Report.pdf
Paper
No peer-reviewed paper found.
Modelcard
Model card primarily contains eval results, with some limitations mentioned.
https://huggingface.co/Nanbeige/Nanbeige4-3B-Thinking-2511
Datasheet
No datasheet found.
Access
Package
No package found.
API and Meta Prompts
No API found.
Licenses
Apache 2.0, an OSI-approved license.
