AttributeError: 'list' object has no attribute 'keys' on Mac M4
Environment
- Model: EXAONE-3.5-2.4B-Instruct
- Device: Mac M4 Pro (Apple Silicon), 24GB RAM
- OS: macOS
- Python: 3.13.2
- transformers: 5.0.0
- torch: 2.10.0
Issue
When loading the model on Mac M4, I get the following error:
AttributeError: 'list' object has no attribute 'keys'
File: .../transformers/modeling_utils.py:2452
Code to reproduce
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "LGAI-EXAONE/EXAONE-3.5-2.4B-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    low_cpu_mem_usage=True,
    trust_remote_code=True,
)
Notes
- This error occurs specifically on Apple Silicon (M4)
- EXAONE-3.0-7.8B works fine on the same machine
- Other models (Llama, Sentence Transformers) work without issues
- Both MPS and CPU backends produce the same error
Root cause identified: _tied_weights_keys type mismatch (list vs mapping)
I tracked the crash down to a type mismatch in EXAONE’s remote modeling code during Transformers weight-tying init.
What happens
- ExaoneForCausalLM._tied_weights_keys is defined as a list: ['lm_head.weight']
- During post_init(), Transformers calls get_expanded_tied_weights_keys(), which expects a dict-like mapping and calls .keys() / .values()
- This raises: AttributeError: 'list' object has no attribute 'keys'
Evidence (introspection)
ModelCls: ExaoneForCausalLM
_tied_weights_keys: type=<class 'list'> value=['lm_head.weight']
.../modeling_exaone.py", line 999, in __init__
self.post_init()
.../transformers/modeling_utils.py", line 1343, in post_init
self.all_tied_weights_keys = self.get_expanded_tied_weights_keys(...)
.../transformers/modeling_utils.py", line 2452, in get_expanded_tied_weights_keys
tied_mapping.keys() | tied_mapping.values()
AttributeError: 'list' object has no attribute 'keys'
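The mismatch can be reproduced standalone, without downloading the model. This is a minimal sketch of the failing expression; the mapping value "transformer.wte.weight" below is an assumption for illustration only (the real source parameter name in EXAONE may differ):

```python
# Standalone sketch of the expression that fails in get_expanded_tied_weights_keys.
# Transformers v5 expects a dict-like mapping {tied_param: source_param};
# EXAONE's remote code defines a plain list instead.

def expanded_tied_keys(tied_mapping):
    # Mirrors the failing expression at modeling_utils.py:2452
    return tied_mapping.keys() | tied_mapping.values()

list_style = ["lm_head.weight"]  # what EXAONE 3.5's remote code defines
dict_style = {"lm_head.weight": "transformer.wte.weight"}  # mapping form (source name assumed)

try:
    expanded_tied_keys(list_style)
except AttributeError as e:
    print(e)  # 'list' object has no attribute 'keys'

print(sorted(expanded_tied_keys(dict_style)))
# ['lm_head.weight', 'transformer.wte.weight']
```

Note that dict key views support set union with any iterable, so the mapping form passes through the same expression cleanly.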
Suggestion
Please update the EXAONE remote code to provide tied-weights metadata in the expected format (a dict-like mapping), or override the tying API so it aligns with Transformers' get_expanded_tied_weights_keys() expectations.
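For illustration, one way the change could look in modeling_exaone.py (a sketch, not an authoritative patch; the source parameter name "transformer.wte.weight" is an assumption and must match EXAONE's actual embedding path):

```python
# Sketch of the suggested change in the remote modeling code (class body only).
# Before (crashes on Transformers v5):
#     _tied_weights_keys = ["lm_head.weight"]
# After: map each tied parameter to its source parameter. The source name
# "transformer.wte.weight" is assumed here, not taken from EXAONE's code.

class ExaoneForCausalLM:  # illustrative stub, not the real class
    _tied_weights_keys = {"lm_head.weight": "transformer.wte.weight"}
```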
(Full repro + environment + screenshots are already in this Discussion #9 thread.)
Update with verification:
I've confirmed the issue is specific to EXAONE 3.5 2.4B by testing other models on the same Mac M4 Pro:
✅ Qwen/Qwen2.5-7B-Instruct - Loaded successfully
✅ beomi/Llama-3-Open-Ko-8B - Loaded successfully
✅ sentence-transformers - Works fine
❌ EXAONE-3.5-2.4B-Instruct - AttributeError
All tests used:
- Same environment (transformers 5.0.0, torch 2.10.0)
- Same hardware (Mac M4 Pro, 24GB RAM)
- Same Python version (3.13.2)
This proves the issue is model-specific, not an environment problem.
The other cells are the same as in the previous run (as uploaded earlier).
Version Testing Confirmation
I want to clarify that I've already tested multiple transformers versions to rule out version-related issues:
Tested Versions:
- transformers 4.46.0: ❌ AttributeError (same error)
- transformers 5.0.0: ❌ AttributeError (same error)
Control Tests (same environment):
- Qwen/Qwen2.5-7B-Instruct: ✅ Works
- beomi/Llama-3-Open-Ko-8B: ✅ Works
All tests used:
- Mac M4 Pro, 24GB RAM
- Python 3.13.2
- torch 2.10.0
The error consistently occurs at modeling_utils.py:2452 in get_expanded_tied_weights_keys.
Root cause confirmed: _tied_weights_keys is a list, but Transformers expects a dict-like mapping.
Hello, @Bias92 . Thank you for your attention and your work.
EXAONE 3.5 relies on custom modeling code via trust_remote_code=True, which is currently outdated for Transformers v5.0.0 and later.
We encountered the same errors you described; the modeling code would need to be refactored to work properly.
As an alternative, we recommend "llamafying" the model, as others have done, or using an already converted version directly, for example:
https://huggingface.co/beomi/EXAONE-3.5-2.4B-Instruct-Llamafied
Hi @nuxlear ,
Thank you for confirming the issue!
Would it be okay if I submit a PR to fix the _tied_weights_keys compatibility issue with Transformers v5.0.0+?
I'd be happy to contribute if it would be helpful.
Definitely! We really appreciate your contributions 😀
On our side, we’ll update the modeling code to make sure EXAONE 3.5 works properly with Transformers v5.0.0+, using the latest Transformers’ features.
We’ll let you know once the updates are ready.
Thanks so much for this opportunity!
As a student, this is the greatest honor and incredible chance to contribute 🙏
I'll prepare a draft PR for the _tied_weights_keys fix.
Once your modeling code updates are ready, I'll test and adapt the patch accordingly.
Really appreciate your work on EXAONE! 🔥🔥
I've submitted a PR for the fix:
https://huggingface.co/LGAI-EXAONE/EXAONE-3.5-2.4B-Instruct/discussions/10
We’ve updated the modeling and config code for Transformers v5. Please check the model again!
Thanks for the update! I re-tested on my side and confirmed the previous _tied_weights_keys issue is resolved.
However, I’m now seeing a new crash in model.generate(): AttributeError: 'AttentionInterface' object has no attribute 'get_interface'.
Full details + repro + screenshots are here:
https://huggingface.co/LGAI-EXAONE/EXAONE-3.5-2.4B-Instruct/discussions/11