modeling_florence2.py incompatible with transformers>=4.54.0: _supports_sdpa AttributeError
#115
by j2gg0s - opened
Bug
Florence2ForConditionalGeneration fails to load with transformers==4.54.0:
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("microsoft/Florence-2-large", trust_remote_code=True)
# AttributeError: 'Florence2ForConditionalGeneration' object has no attribute '_supports_sdpa'
Root Cause
In modeling_florence2.py, Florence2PreTrainedModel._supports_sdpa is a @property that delegates to
self.language_model._supports_sdpa. But transformers 4.54.0 checks self._supports_sdpa inside
PreTrainedModel.__init__(), before self.language_model is assigned in the subclass __init__.
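The init-order problem can be reproduced in isolation with two toy classes (hypothetical names, standing in for PreTrainedModel and the Florence-2 subclass):

```python
class Base:
    def __init__(self):
        # Stand-in for transformers 4.54.0, which reads this flag
        # during PreTrainedModel.__init__()
        self.sdpa_ok = self._supports_sdpa

class Broken(Base):
    @property
    def _supports_sdpa(self):
        # language_model is only assigned AFTER super().__init__() runs,
        # so this raises AttributeError during construction
        return self.language_model._supports_sdpa

class Fixed(Base):
    @property
    def _supports_sdpa(self):
        if not hasattr(self, "language_model"):
            return True  # safe default until language_model exists
        return self.language_model._supports_sdpa

try:
    Broken()
except AttributeError as e:
    print("Broken:", e)  # 'Broken' object has no attribute 'language_model'

print("Fixed:", Fixed().sdpa_ok)  # True
```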
Suggested Fix
Add hasattr guard to the property:
class Florence2PreTrainedModel(PreTrainedModel):
    @property
    def _supports_sdpa(self):
        # PreTrainedModel.__init__() reads this flag before the subclass
        # has assigned self.language_model, so fall back to a safe default.
        if not hasattr(self, 'language_model'):
            return True
        return self.language_model._supports_sdpa

    @property
    def _supports_flash_attn_2(self):
        if not hasattr(self, 'language_model'):
            return True
        return self.language_model._supports_flash_attn_2
Environment
- transformers==4.54.0
- torch==2.x
- Python 3.10
bump. same issue here.
The models were converted in this repo: https://huggingface.co/florence-community
I just switched and it works with the latest versions of transformers.