
Parameter descriptions:

Base Model Data
Are data sources for training the base model comprehensively documented and freely made available? If a distinction between the base (foundation) model and the end (user) model is not applicable, this entry mirrors the end model data entries.
End User Model Data
Are data sources for training the model that the end user interacts with comprehensively documented and freely made available?
Base Model Weights
Are the weights of the base model made freely available? If a distinction between the base (foundation) model and the end (user) model is not applicable, this entry mirrors the end model weights entries.
End User Model Weights
Are the weights of the model that the end user interacts with made freely available?
Training Code
Is the source code for data source processing, model training, and tuning made comprehensively and freely available?
Code Documentation
Is the source code for data source processing, model training, and tuning comprehensively documented?
Hardware Architecture
Is the hardware architecture used for data source processing and model training comprehensively documented?
Preprint
Are archived preprint(s) available that detail all major parts of the system, including data source processing, model training, and tuning steps?
Paper
Are peer-reviewed scientific publications available that detail all major parts of the system, including data source processing, model training, and tuning steps?
Modelcard
Is a model card in a standardized format available that provides comprehensive insight into model architecture, training, fine-tuning, and evaluation?
Datasheet
Is a datasheet as defined in "Datasheets for Datasets" (Gebru et al. 2021) available?
Package
Is a packaged release of the model available on a software repository (e.g. the Python Package Index or Homebrew)?
API and Meta Prompts
Is an API available that provides unrestricted access to the model (other than security and CDN restrictions)? If applicable, this entry also collects information on the use and availability of meta prompts.
Licenses
Is the project fully covered by Open Source Initiative (OSI)-approved licenses, including all data sources and training pipeline code?

BLOOMZ

by BigScience Workshop

Large open multilingual language model.
Text
Full
https://huggingface.co/bigscience/bloomz-p3
BLOOM
BLOOMZ-P3
Apache 2.0 and RAIL (Responsible AI License)
Research workshop on large multilingual models.
https://bigscience.huggingface.co/
September 2022
Availability
Training Code
Repository provides a guided overview to all components.
https://github.com/bigscience-workshop/xmtf
Base Model Data
Data made available & documented in detail in repo and preprint.
https://github.com/bigscience-workshop/xmtf#data
End User Model Data
From the documentation: 'xP3 (Crosslingual Public Pool of Prompts) is a collection of prompts & datasets across 46 languages & 16 NLP tasks.'
https://huggingface.co/datasets/bigscience/xP3all
Base Model Weights
Weights made available on HuggingFace.
https://huggingface.co/bigscience/bloom
End User Model Weights
Weights made available on HuggingFace.
https://huggingface.co/bigscience/bloomz-p3
Documentation
Code Documentation
Code well-documented.
https://github.com/bigscience-workshop/xmtf
Hardware Architecture
Architecture described in the preprint, code available in the GitHub repo, recipe on HuggingFace.
https://arxiv.org/pdf/2211.05100
https://github.com/bigscience-workshop/xmtf#create-xp3x
Preprint
Preprints for base and fine-tuned models available on arXiv.
https://arxiv.org/abs/2211.05100
https://arxiv.org/pdf/2211.01786
Paper
A peer-reviewed paper (9 pages plus a 114-page appendix) describes the multitask finetuning (instruction tuning) of BLOOM (see preprint) to form BLOOMZ.
https://aclanthology.org/2023.acl-long.891/
Modelcard
Model card provides the requisite detail.
https://huggingface.co/bigscience/bloomz-p3
Datasheet
Dataset documented in dataset card at HuggingFace.
https://huggingface.co/datasets/bigscience/xP3
Access
Package
No packages published.
API and Meta Prompts
The Petals API provides access through a HuggingFace Space; however, the API is currently paused.
https://huggingface.co/spaces/bigscience/petals-api
Licenses
Code is licensed under Apache 2.0; the model is under the bespoke 'Responsible AI License', which imposes some usage limitations.
https://bigscience.huggingface.co/blog/the-bigscience-rail-license

Supported by the Centre for Language Studies and the Dutch Research Council. Website design & development © 2024 by BSTN. This version of the index generated 09 Apr 2025, website content last updated 23 Apr 2025.