LLM-Drop
Collection of model weights for the paper "What Matters in Transformers? Not All Attention is Needed" (https://arxiv.org/abs/2406.15786) • 15 items