How to use oskrmiguel/mt5-simplification-spanish with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("oskrmiguel/mt5-simplification-spanish")
model = AutoModelForSeq2SeqLM.from_pretrained("oskrmiguel/mt5-simplification-spanish")
```
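Once the tokenizer and model are loaded, simplification is ordinary seq2seq generation. A minimal sketch follows; the example sentence and the generation settings (`max_length`, `num_beams`) are illustrative choices, not values from the model card.

```python
# Sketch: simplifying one Spanish sentence with the loaded model.
# The input text and generation parameters are illustrative assumptions.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("oskrmiguel/mt5-simplification-spanish")
model = AutoModelForSeq2SeqLM.from_pretrained("oskrmiguel/mt5-simplification-spanish")

text = "La reunión fue pospuesta debido a circunstancias imprevistas que afectaron a los asistentes."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_length=64, num_beams=4)
simplified = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(simplified)
```

Beam search is a common default for simplification models, but greedy decoding or sampling also work; adjust to taste.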
mt5-simplification-spanish
Model description
This is a fine-tuned mt5-small model for generating simple Spanish text from complex Spanish text.
The model was created in collaboration with the IXA research group at the University of the Basque Country. It was trained and tested on the Simplext corpus and evaluated with the SARI, BLEU, and FKGL metrics.
Dataset
Simplext
Model Evaluation
BLEU: 13.186
SARI: 42.203
FKGL: 10.284
Authors
Oscar M. Cumbicus-Pineda, Itziar Gonzalez-Dios, Aitor Soroa, November 2021
Code