---
license: mit
language:
- en
pipeline_tag: text-generation
arxiv:
- https://arxiv.org/abs/2508.06595
library_name: transformers
---

## Model Details
This is the best-performing Meta-Llama-3-8B-Instruct checkpoint unlearned with RMU (Representation Misdirection for Unlearning) on the Textbook-HP-Simplest forget set. For more details, please see our paper, [LLM Unlearning Without an Expert Curated Dataset](https://arxiv.org/abs/2508.06595).
### Sources
- Base model: [meta-llama/Meta-Llama-3-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct)
- Repository: [https://github.com/xyzhu123/Synthetic_Textbook](https://github.com/xyzhu123/Synthetic_Textbook)
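
### How to Use

A minimal generation sketch using the `transformers` library. The repo id below is a placeholder for this checkpoint's Hub path; substitute the actual id when loading.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- replace with this checkpoint's actual Hub path.
model_id = "xyzhu123/Llama-3-8B-Instruct_RMU_Textbook-HP-Simplest"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Llama-3-Instruct checkpoints expect the chat template.
messages = [{"role": "user", "content": "Who is Harry Potter?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```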
## Performance

HP MCQ measures knowledge of the HP forget set (lower is better after unlearning); tinyMMLU, GSM8k, and TriviaQA measure retained general ability.
| Model | HP MCQ | tinyMMLU | GSM8k | TriviaQA |
|---|---|---|---|---|
| Llama-3-8B-Instruct | 75.63 | 59.21 | 75.28 | 51.09 |
| Llama-3-8B-Instruct_RMU_Textbook-HP-Simplest | 30.83 | 55.10 | 74.00 | 49.87 |
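
The retained-ability benchmarks (tinyMMLU, GSM8k, TriviaQA) are standard tasks, so scores in this ballpark should be reproducible with EleutherAI's lm-evaluation-harness. The sketch below is not the paper's exact evaluation pipeline, and it assumes the standard harness task names match our setup; the HP MCQ evaluation is custom and lives in the repository linked above.

```python
import lm_eval

# Hedged sketch: evaluate the retained-ability benchmarks with
# lm-evaluation-harness. The repo id is a placeholder, and the task
# names assume the standard registrations in lm-eval v0.4+.
results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=xyzhu123/Llama-3-8B-Instruct_RMU_Textbook-HP-Simplest",
    tasks=["tinyMMLU", "gsm8k", "triviaqa"],
    batch_size=8,
)
print(results["results"])  # per-task metric dictionaries
```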
## Citation
If you find this useful in your research, please consider citing our paper:
```bibtex
@misc{zhu2025llmunlearningexpertcurated,
      title={LLM Unlearning Without an Expert Curated Dataset},
      author={Xiaoyuan Zhu and Muru Zhang and Ollie Liu and Robin Jia and Willie Neiswanger},
      year={2025},
      eprint={2508.06595},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2508.06595},
}
```