
Parameter descriptions:

Base Model Data
Are datasources for training the base model comprehensively documented and made available? Where the distinction between base (foundation) and end (user) model does not apply, this mirrors the end model data entries.
End User Model Data
Are datasources for training the model that the end user interacts with comprehensively documented and made available?
Base Model Weights
Are the weights of the base models made freely available? Where the distinction between base (foundation) and end (user) model does not apply, this mirrors the end model weights entries.
End User Model Weights
Are the weights of the model that the end user interacts with made freely available?
Training Code
Is the source code of datasource processing, model training and tuning comprehensively made available?
Code Documentation
Is the source code of datasource processing, model training and tuning comprehensively documented?
Hardware Architecture
Is the hardware architecture used for datasource processing and model training comprehensively documented?
Preprint
Are archived preprint(s) available that detail all major parts of the system including datasource processing, model training and tuning steps?
Paper
Are peer-reviewed scientific publications available that detail all major parts of the system including datasource processing, model training and tuning steps?
Modelcard
Is a model card available in standardized format that provides comprehensive insight into model architecture, training, fine-tuning, and evaluation?
Datasheet
Is a datasheet as defined in "Datasheets for Datasets" (Gebru et al. 2021) available?
Package
Is a packaged release of the model available on a software repository (e.g. the Python Package Index or Homebrew)?
API and Meta Prompts
Is an API available that provides unrestricted access to the model (other than security and CDN restrictions)? If applicable, this entry also collects information on the use and availability of meta prompts.
Licenses
Is the project fully covered by Open Source Initiative (OSI)-approved licenses, including all data sources and training pipeline code?
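
The parameters above function as a fixed evaluation schema. Purely as an illustration (the field names and the open/partial/closed scale below are assumptions, not the index's actual data format), a single index entry could be represented along these lines:

```python
# Illustrative sketch of one index entry. Field names and the
# open/partial/closed scale are assumptions, not the index's real format.
from dataclasses import dataclass, field

PARAMETERS = [
    "Base Model Data", "End User Model Data", "Base Model Weights",
    "End User Model Weights", "Training Code", "Code Documentation",
    "Hardware Architecture", "Preprint", "Paper", "Modelcard",
    "Datasheet", "Package", "API and Meta Prompts", "Licenses",
]

@dataclass
class IndexEntry:
    system: str
    organizations: list[str]
    scores: dict[str, str] = field(default_factory=dict)  # parameter -> open/partial/closed
    notes: dict[str, str] = field(default_factory=dict)   # parameter -> free-text note
    links: dict[str, str] = field(default_factory=dict)   # parameter -> evidence URL

poro = IndexEntry(
    system="Poro",
    organizations=["AMD Silo AI", "TurkuNLP", "HPLT"],
    scores={"End User Model Data": "open", "Base Model Data": "closed"},
    notes={"End User Model Data": "Finetuning data published on HuggingFace."},
    links={"End User Model Data":
           "https://huggingface.co/datasets/LumiOpen/poro2-instruction-collection"},
)
assert set(poro.scores) <= set(PARAMETERS)  # scores must use known parameters
```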

Poro

by AMD Silo AI, TurkuNLP, and High Performance Language Technologies (HPLT)

Bilingual model trained on Finnish and English.
Text · Full
Link: https://huggingface.co/LumiOpen/Llama-Poro-2-70B-Instruct
Base model: Llama-Poro-2-70B-Base
End model: Llama-Poro-2-70B-Instruct
License: Llama 3.3 Community License Agreement
Note: Silo AI was acquired by AMD in August 2024.
Organizations: https://www.silo.ai · https://turkunlp.org · https://hplt-project.org
March 2025
Availability
Base Model Data
Model based on Llama, whose training data is not disclosed or documented anywhere.
End User Model Data
Finetuning data published on HuggingFace.
https://huggingface.co/datasets/LumiOpen/poro2-instruction-collection
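
For reference, the published collection can be inspected directly with the Hugging Face datasets library; a minimal sketch, assuming the repository's default configuration (split and column names are whatever the repo exposes):

```python
# Minimal sketch: inspecting the published finetuning data.
# Assumes the repository's default configuration; split names may vary.
from datasets import load_dataset

ds = load_dataset("LumiOpen/poro2-instruction-collection")
print(ds)               # available splits and their columns
split = next(iter(ds))  # pick whichever split is listed first
print(ds[split][0])     # show one example record
```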
Base Model Weights
Weights made available on HuggingFace.
https://huggingface.co/LumiOpen/Llama-Poro-2-70B-Instruct
End User Model Weights
Weights made available on HuggingFace.
https://huggingface.co/LumiOpen/Llama-Poro-2-70B-Instruct
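
Since the weights are hosted on the Hub, they can be loaded via the standard transformers path; a minimal sketch, where only the model id comes from the link above (dtype, device placement, prompt, and generation settings are illustrative, and a 70B model needs multiple GPUs or quantization):

```python
# Minimal sketch: loading the instruct weights and running one chat turn.
# Only the model id comes from this page; all other settings are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LumiOpen/Llama-Poro-2-70B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # keep the checkpoint's dtype
    device_map="auto",   # shard across available GPUs (needs accelerate)
)

messages = [{"role": "user", "content": "Kerro lyhyesti Suomen historiasta."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(inputs, max_new_tokens=200)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```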
Training Code
Custom fork of the Megatron-DeepSpeed framework used for training Poro-34B.
https://github.com/LumiOpen/Megatron-LM-lumi
Documentation
Code Documentation
Training code and related scripts are publicly available.
https://github.com/LumiOpen/Megatron-LM-lumi/tree/main
Hardware Architecture
Hardware architecture described in blog post.
https://rocm.blogs.amd.com/artificial-intelligence/multilingual-continued-pretraining/README.html
Preprint
No preprint found; release by blog post only.
https://rocm.blogs.amd.com/artificial-intelligence/multilingual-continued-pretraining/README.html
Paper
No peer-reviewed paper found.
Modelcard
Model card provides broad overview and links to full details.
https://huggingface.co/LumiOpen/Llama-Poro-2-70B-Instruct
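
The card itself can also be fetched programmatically via huggingface_hub, which is one way to check what it documents; a minimal sketch:

```python
# Minimal sketch: fetching the model card for inspection.
from huggingface_hub import ModelCard

card = ModelCard.load("LumiOpen/Llama-Poro-2-70B-Instruct")
print(card.data)        # structured metadata (license, language tags, ...)
print(card.text[:500])  # first part of the card's markdown body
```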
Datasheet
Datasheet describes sourcing.
https://huggingface.co/datasets/LumiOpen/poro2-instruction-collection
Access
Package
No packages published.
API and Meta Prompts
No API found.
Licenses
Llama 3.3 Community License Agreement; not an OSI-approved open license.
https://huggingface.co/LumiOpen/Llama-Poro-2-70B-Instruct#license
