
Parameter descriptions:

Base Model Data
Are data sources for training the base model comprehensively documented and freely made available? If a distinction between the base (foundation) model and the end (user) model is not applicable, this mirrors the end user model data entry.
End User Model Data
Are data sources for training the model that the end user interacts with comprehensively documented and freely made available?
Base Model Weights
Are the weights of the base model made freely available? If a distinction between the base (foundation) model and the end (user) model is not applicable, this mirrors the end user model weights entry.
End User Model Weights
Are the weights of the model that the end user interacts with made freely available?
Training Code
Is the source code for data source processing, model training, and tuning made comprehensively and freely available?
Code Documentation
Is the source code for data source processing, model training, and tuning comprehensively documented?
Hardware Architecture
Is the hardware architecture used for data source processing and model training comprehensively documented?
Preprint
Are archived preprint(s) available that detail all major parts of the system, including data source processing, model training, and tuning steps?
Paper
Are peer-reviewed scientific publications available that detail all major parts of the system, including data source processing, model training, and tuning steps?
Modelcard
Is a model card in a standardized format available that provides comprehensive insight into model architecture, training, fine-tuning, and evaluation?
Datasheet
Is a datasheet as defined in "Datasheets for Datasets" (Gebru et al. 2021) available?
Package
Is a packaged release of the model available on a software repository (e.g. the Python Package Index or Homebrew)?
API and Meta Prompts
Is an API available that provides unrestricted access to the model (other than security and CDN restrictions)? If applicable, this entry also collects information on the use and availability of meta prompts.
Licenses
Is the project fully covered by Open Source Initiative (OSI)-approved licenses, including all data sources and training pipeline code?

mT0

by bigscience-workshop

Large open multilingual language model.
Text
Full
https://huggingface.co/bigscience/mt0-xxl-p3
mT5-XXL
mT0-XXL-P3
Apache-2.0
Research workshop on large multilingual models.
https://github.com/bigscience-workshop
May 2023
Availability
Training Code
Training procedure is described in a guide with some code, but no model-specific repository is available.
https://github.com/google-research/t5x/blob/main/docs/usage/finetune.md
Base Model Data
mC4, the multilingual variant of C4
https://www.tensorflow.org/datasets/catalog/c4#c4multilingual
End User Model Data
xP3
https://github.com/bigscience-workshop/xmtf?tab=readme-ov-file#data
Base Model Weights
https://huggingface.co/bigscience/mt0-xxl-mt
End User Model Weights
Various model variants are available (see the loading sketch below).
https://huggingface.co/bigscience/mt0-large
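
As an illustration of how the freely available end-user weights listed above can be used, here is a minimal loading sketch with the Hugging Face transformers library. It assumes transformers and a PyTorch backend are installed; the checkpoint name is taken from the weights link above, and the prompt is illustrative only.

# Minimal sketch: load an mT0 end-user checkpoint (assumes transformers + torch installed).
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "bigscience/mt0-large"  # checkpoint linked under End User Model Weights
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

inputs = tokenizer("Translate to English: Je t'aime.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))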
Documentation
Code Documentation
Training procedure
https://arxiv.org/pdf/2010.11934
Hardware Architecture
Architecture: same as mT5-XXL; also refer to the model's config.json file.
Preprint
https://arxiv.org/abs/2211.01786
Paper
https://virtual2023.aclweb.org/paper_P283.html
Modelcard
https://huggingface.co/bigscience/mt0-xxl
Datasheet
https://huggingface.co/datasets/bigscience/xP3
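
The xP3 datasheet linked above corresponds to a dataset hosted on the Hugging Face Hub, so the fine-tuning data can be inspected directly. The following is a minimal sketch assuming the datasets library is installed; the "en" configuration name is an assumption here, so check the dataset card for the actual configuration names.

# Minimal sketch: stream a record of the xP3 fine-tuning data (assumes datasets installed).
# The "en" configuration name is an assumption; see the dataset card linked above.
from datasets import load_dataset

ds = load_dataset("bigscience/xP3", "en", split="train", streaming=True)
for example in ds:
    print(example)  # one prompted training record (input/target pair)
    break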
Access
Package
API and Meta Prompts
None found.
Licenses
Model weights, the fine-tuning dataset (xP3), and the base model dataset (mC4) are all released under Apache-2.0.
