@misc{coreteam2025mimounlockingreasoningpotential,
      title={MiMo: Unlocking the Reasoning Potential of Language Model -- From Pretraining to Posttraining},
      author={{Xiaomi LLM-Core Team}},
      year={2025},
      eprint={2505.07608},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2505.07608},
}
Downloads last month: 101
Format: Safetensors · Model size: 2B params · Tensor type: BF16

Paper for R-Kentaren/MiMo-7B-RL-Zero