Llama3.2-3B-eu-continual
Llama-3.2 3B for Basque, continually pretrained on the ZelaHandi-v1 corpus.
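Below is a minimal generation sketch using Hugging Face Transformers. It assumes the checkpoint is published under the repository id orai-nlp/Llama3.2-3B-eu-continual (the id listed in the model tree below) and that hardware with bfloat16 support is available; the dtype, sampling settings, and Basque prompt are illustrative choices, not prescriptions.

```python
# Minimal usage sketch (assumption: repository id and bf16 hardware as noted above).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "orai-nlp/Llama3.2-3B-eu-continual"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 weights fit on your device
    device_map="auto",
)

# This is a base (non-instruct) model, so plain text completion is the natural use.
prompt = "Euskal Herriko hiriburuak honako hauek dira:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```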
Citation
If you use this model, please cite the following:
@misc{orai2025llama3.2-eu-continual,
  title={Llama3.2-3B-eu-continual},
  author={Corral, Ander and Urbizu, Gorka and Saralegi, Xabier},
  publisher={Orai NLP Technologies},
  url={https://huggingface.co/datasets/orai-nlp/Llama3.2-3B-eu-continual},
  year={2025}
}
Contact
- Ander Corral ([email protected])
- Gorka Urbizu ([email protected])
- Xabier Saralegi ([email protected])
Model tree for orai-nlp/Llama3.2-3B-eu-continual
Base model: meta-llama/Llama-3.2-3B