European Open Source AI Index

Fietje

by Bram Vanroy

Derivative of Phi-2, fine-tuned for Dutch text generation.
Text
Full
https://huggingface.co/BramVanroy/fietje-2-chat
Phi-2
Fietje-2-Chat
MIT License
Independent model creator.
https://bramvanroy.github.io/
April 2024
Availability
Base Model Data
Based on Phi-2 by Microsoft, for which the pretraining data has not been disclosed. The Phi-2 documentation says only: '250B tokens, combination of NLP synthetic data created by AOAI GPT-3.5 and filtered web data from Falcon RefinedWeb and SlimPajama, which was assessed by AOAI GPT-4.'
End User Model Data
Fine-tuning of the base Phi model is not fully documented; the derivative uses a Dutch mix of Wikipedia and CulturaX data.
https://huggingface.co/datasets/BramVanroy/wikipedia_culturax_dutch
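For illustration only (not part of the index entry), the Dutch fine-tuning mix can be inspected with the Hugging Face datasets library. The configuration name and the 'text' column in the sketch below are assumptions; the dataset card lists the actual options.

# Illustrative sketch, not the author's training pipeline: stream a few samples
# from the Dutch Wikipedia/CulturaX mix without downloading the full corpus.
from itertools import islice
from datasets import load_dataset

# The config name "10M" and the "text" column are assumptions; see the dataset
# card at https://huggingface.co/datasets/BramVanroy/wikipedia_culturax_dutch.
ds = load_dataset("BramVanroy/wikipedia_culturax_dutch", "10M", split="train", streaming=True)
for example in islice(ds, 3):
    print(example["text"][:200])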
Base Model Weights
Phi-2 model weights are available through Hugging Face.
https://huggingface.co/microsoft/phi-2
End User Model Weights
https://huggingface.co/BramVanroy/fietje-2-chat
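A minimal sketch, assuming standard transformers chat-template support, of loading the released fietje-2-chat weights and generating Dutch text; the generation settings are illustrative, and the model card should be consulted for recommended usage.

# Minimal sketch: load the fietje-2-chat weights and generate a short Dutch reply.
# Chat-template support and generation settings are assumptions; see the model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BramVanroy/fietje-2-chat"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

messages = [{"role": "user", "content": "Schrijf een korte alinea over de Nederlandse taal."}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
output = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))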
Training Code
Source code for the base model is not available, but training of the derivative is well documented in a GitHub repository.
https://github.com/BramVanroy/fietje-2
Documentation
Code Documentation
Source code for the base model is not available, but training of the derivative model is documented in exemplary detail.
https://github.com/BramVanroy/fietje-2/tree/main/training
Hardware Architecture
https://huggingface.co/BramVanroy/fietje-2-chat
Preprint
https://arxiv.org/abs/2412.15450
Paper
Modelcard
https://huggingface.co/BramVanroy/fietje-2-chat
Datasheet
No datasheet available for the base model.
Access
Licenses
Model weights are provided under the MIT License, but the training data inherits unclarity and limitations from its upstream data sources (Falcon RefinedWeb, SlimPajama).