# EAA Fusion Head (HuBERT + WavLM → Gemma-3-270M)
This repository contains the fusion/regression head and config for the EAA system.
Adapters (same overall model):

- HuBERT LoRA adapter: marccgrau/EEA_hubert_adapter
- WavLM LoRA adapter: marccgrau/EEA_wavlm_adapter
- Gemma-3-270M LoRA adapter: marccgrau/EEA_gemma3_270m_adapter_hubert_wavlm
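The adapters above can be attached to their base models with `peft`. A minimal sketch follows; the base checkpoint IDs (`facebook/hubert-large-ls960-ft`, `microsoft/wavlm-large`, `google/gemma-3-270m`) are assumptions, not confirmed by this card:

```python
# Sketch: attaching the published LoRA adapters to assumed base checkpoints.
from transformers import AutoModel, AutoModelForCausalLM
from peft import PeftModel

# Assumed base models -- verify against the training config before use.
hubert = AutoModel.from_pretrained("facebook/hubert-large-ls960-ft")
hubert = PeftModel.from_pretrained(hubert, "marccgrau/EEA_hubert_adapter")

wavlm = AutoModel.from_pretrained("microsoft/wavlm-large")
wavlm = PeftModel.from_pretrained(wavlm, "marccgrau/EEA_wavlm_adapter")

gemma = AutoModelForCausalLM.from_pretrained("google/gemma-3-270m")
gemma = PeftModel.from_pretrained(
    gemma, "marccgrau/EEA_gemma3_270m_adapter_hubert_wavlm"
)
```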
The full model fuses the HuBERT and WavLM frame sequences via dual cross-attention, reduces the fused sequence to N tokens, and conditions Gemma-3-270M (via LoRA) on those tokens to regress an aggression score on a 1–10 scale.
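The fusion described above can be sketched as follows. This is an illustrative PyTorch module, not the shipped implementation; all dimensions (`d_model=640` to match Gemma-3-270M's hidden size, `n_tokens=8`, head counts) are assumptions:

```python
import torch
import torch.nn as nn

class DualCrossAttentionFusion(nn.Module):
    """Sketch: HuBERT and WavLM streams cross-attend to each other,
    then learned queries pool the fused sequence to a fixed number of
    tokens sized for the Gemma embedding space."""

    def __init__(self, d_hubert=1024, d_wavlm=1024,
                 d_model=640, n_tokens=8, n_heads=8):
        super().__init__()
        self.proj_h = nn.Linear(d_hubert, d_model)
        self.proj_w = nn.Linear(d_wavlm, d_model)
        # Dual cross-attention: each stream queries the other.
        self.attn_hw = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.attn_wh = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Learned queries reduce the fused sequence to n_tokens.
        self.queries = nn.Parameter(torch.randn(n_tokens, d_model))
        self.reduce = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, h, w):
        # h: (B, T_h, d_hubert), w: (B, T_w, d_wavlm)
        h, w = self.proj_h(h), self.proj_w(w)
        h2, _ = self.attn_hw(h, w, w)   # HuBERT attends to WavLM
        w2, _ = self.attn_wh(w, h, h)   # WavLM attends to HuBERT
        fused = torch.cat([h + h2, w + w2], dim=1)
        q = self.queries.unsqueeze(0).expand(fused.size(0), -1, -1)
        tokens, _ = self.reduce(q, fused, fused)
        return tokens  # (B, n_tokens, d_model): prepended to Gemma's inputs

fusion = DualCrossAttentionFusion()
out = fusion(torch.randn(2, 50, 1024), torch.randn(2, 60, 1024))
print(out.shape)  # torch.Size([2, 8, 640])
```

The pooled tokens would then be injected into Gemma's input embedding sequence, with the LoRA-adapted model trained to emit the regression target.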