European Open Source AI Index

T5

by Google AI

This entry collates T5 base, T5X, and FLAN-T5.
Text
Limited
https://huggingface.co/docs/transformers/en/model_doc/t5
T5
T5X
Apache-2.0
Major technology company, operator of Google Search.
https://ai.google
October 2019
Availability
Base Model Data
C4; T5X pretraining on The Pile, with finetuning on SQuAD and MNLI.
https://github.com/google-research/t5x
https://github.com/google-research/text-to-text-transfer-transformer/blob/main/README.md#dataset-preparation
End User Model Data
Finetuning on SQuAD and MNLI
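T5 casts every task, including the SQuAD and MNLI finetuning listed above, as text-to-text by prepending a task prefix to the input string. A minimal sketch of that framing, assuming the prefix formats described in the T5 paper's appendix (the helper function names are illustrative, not part of any released API):

```python
# Sketch of T5's text-to-text task framing. The exact prefix strings are
# assumed from the T5 paper's appendix; verify against the released
# preprocessing code before relying on them.

def format_mnli(premise: str, hypothesis: str) -> str:
    """Cast an MNLI pair as a single text-to-text input string."""
    return f"mnli hypothesis: {hypothesis} premise: {premise}"

def format_squad(question: str, context: str) -> str:
    """Cast a SQuAD example as a single text-to-text input string."""
    return f"question: {question} context: {context}"

example = format_mnli(
    premise="A person is outdoors, on a horse.",
    hypothesis="A person is at a diner.",
)
print(example)
```

The model then generates the answer as plain text (e.g. "entailment" for MNLI, or the answer span for SQuAD), so the same encoder-decoder checkpoint serves both tasks.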
Base Model Weights
https://github.com/google-research/text-to-text-transfer-transformer?tab=readme-ov-file#released-model-checkpoints
End User Model Weights
https://github.com/google-research/t5x
Training Code
https://github.com/google-research/t5x
Documentation
Code Documentation
https://github.com/google-research/t5x
Hardware Architecture
https://arxiv.org/pdf/1910.10683
Preprint
https://arxiv.org/pdf/1910.10683
Paper
http://www.jmlr.org/papers/v21/20-074.html
Modelcard
https://huggingface.co/docs/transformers/en/model_doc/t5
https://huggingface.co/google-t5/t5-11b
Datasheet
https://huggingface.co/google-t5/t5-11b#training-details
Access
Package
API and Meta Prompts
Licenses
Weights under Apache-2.0; base model data licensing unclear.
Supported by the Centre for Language Studies and the Dutch Research Council. Website design & development © 2024 by BSTN. This version of the index generated 10 March 2026, website content last updated 11 March 2026.