---
license: apache-2.0
tags:
- generated_from_trainer
base_model: openai/whisper-medium
datasets:
- generator
metrics:
- wer
model-index:
- name: whisper-medium-lug
  results:
  - task:
      type: automatic-speech-recognition
      name: Automatic Speech Recognition
    dataset:
      name: generator
      type: generator
      config: default
      split: train
      args: default
    metrics:
    - type: wer
      value: 61.62227602905569
      name: Wer
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/bakera-sunbird/huggingface/runs/c6s8tlgq)
# whisper-medium-lug

This model is a fine-tuned version of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) on the generator dataset.
It achieves the following results on the evaluation set (matching the step-1400 checkpoint in the results table below):
- Loss: 0.2943
- WER: 61.6223
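The headline metric is word error rate (WER), reported here as a percentage: WER = (S + D + I) / N, i.e. the word-level edit distance between hypothesis and reference divided by the number of reference words. Because insertions count, WER can exceed 100%, which is why early checkpoints in the training results below report values like 427.97. A minimal from-scratch sketch of the computation (the actual evaluation most likely used a library implementation such as `evaluate`/`jiwer`, which may also normalize the text first):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate in percent: word-level edit distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # One-row dynamic program for the Levenshtein distance over words.
    d = list(range(len(hyp) + 1))
    for i in range(1, len(ref) + 1):
        prev_diag, d[0] = d[0], i
        for j in range(1, len(hyp) + 1):
            cur = d[j]
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[j] = min(d[j] + 1,          # deletion
                       d[j - 1] + 1,      # insertion
                       prev_diag + cost)  # substitution or match
            prev_diag = cur
    return 100.0 * d[-1] / len(ref)

print(wer("the cat sat down", "the cat sat"))  # one deletion / 4 words -> 25.0
```

A hypothesis longer than its reference illustrates WER above 100%: `wer("okay", "o k a y yes")` gives 500.0, since one substitution plus four insertions are scored against a single reference word.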
## Model description

More information needed. Judging from the model name, this appears to be a Luganda (ISO 639-3: `lug`) speech-recognition fine-tune of Whisper medium.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 8000
- mixed_precision_training: Native AMP
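
With `lr_scheduler_type: linear`, the learning rate ramps linearly from 0 to 1e-05 over the first 500 steps, then decays linearly back to 0 at step 8000. A sketch of that curve, assuming the behavior of transformers' `get_linear_schedule_with_warmup` (the scheduler this configuration selects by default):

```python
def linear_warmup_lr(step: int, base_lr: float = 1e-05,
                     warmup_steps: int = 500, total_steps: int = 8000) -> float:
    """Learning rate at a given optimizer step under linear warmup + linear decay."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps  # warmup ramp up to base_lr
    # Linear decay from base_lr at warmup_steps down to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_warmup_lr(500))   # peak learning rate: 1e-05
print(linear_warmup_lr(4250))  # halfway through the decay: 5e-06
```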
### Training results

| Training Loss | Epoch   | Step | Validation Loss | WER (%)  |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 0.9437        | 0.025   | 200  | 0.4902          | 427.9661 |
| 0.4586        | 1.0108  | 400  | 0.3298          | 150.3027 |
| 0.3741        | 1.0357  | 600  | 0.3337          | 143.5835 |
| 0.2659        | 2.0215  | 800  | 0.2871          | 109.6852 |
| 0.139         | 3.0072  | 1000 | 0.3437          | 131.9613 |
| 0.1734        | 3.0322  | 1200 | 0.3028          | 170.8838 |
| 0.1072        | 4.018   | 1400 | 0.2943          | 61.6223  |
| 0.0726        | 5.0038  | 1600 | 0.3438          | 114.7094 |
| 0.0751        | 5.0287  | 1800 | 0.3526          | 73.6683  |
| 0.0635        | 6.0145  | 2000 | 0.3629          | 159.7458 |
| 0.0554        | 7.0003  | 2200 | 0.3854          | 152.1186 |
| 0.0549        | 7.0252  | 2400 | 0.3751          | 98.5472  |
| 0.0283        | 8.011   | 2600 | 0.3190          | 89.2857  |
| 0.0349        | 8.036   | 2800 | 0.3452          | 155.5085 |
| 0.0379        | 9.0218  | 3000 | 0.3780          | 109.7458 |
| 0.0316        | 10.0075 | 3200 | 0.3880          | 101.4528 |
| 0.0232        | 10.0325 | 3400 | 0.4144          | 67.7966  |
| 0.0246        | 11.0183 | 3600 | 0.3820          | 71.0654  |
| 0.0192        | 12.004  | 3800 | 0.4022          | 107.6877 |
| 0.0195        | 12.029  | 4000 | 0.4276          | 126.9976 |
| 0.013         | 13.0147 | 4200 | 0.4128          | 115.3753 |
| 0.0154        | 14.0005 | 4400 | 0.4371          | 126.6949 |
| 0.0109        | 14.0255 | 4600 | 0.4213          | 142.2518 |
| 0.0133        | 15.0113 | 4800 | 0.4075          | 170.1574 |
| 0.011         | 15.0363 | 5000 | 0.4454          | 116.1622 |
| 0.0104        | 16.022  | 5200 | 0.3950          | 79.5400  |
| 0.0079        | 17.0078 | 5400 | 0.4330          | 109.2010 |
| 0.0083        | 17.0328 | 5600 | 0.4308          | 137.5303 |
| 0.0064        | 18.0185 | 5800 | 0.4178          | 96.2470  |
| 0.0057        | 19.0042 | 6000 | 0.4104          | 99.7579  |
| 0.0076        | 19.0293 | 6200 | 0.4132          | 117.0702 |
| 0.0062        | 20.015  | 6400 | 0.4404          | 146.2470 |
| 0.0035        | 21.0008 | 6600 | 0.4488          | 128.4504 |
| 0.0045        | 21.0257 | 6800 | 0.4415          | 91.0412  |
| 0.0043        | 22.0115 | 7000 | 0.4477          | 89.5884  |
| 0.0038        | 22.0365 | 7200 | 0.4550          | 82.5666  |
| 0.0028        | 23.0222 | 7400 | 0.4451          | 77.4213  |
| 0.003         | 24.008  | 7600 | 0.4424          | 78.5109  |
| 0.0033        | 24.033  | 7800 | 0.4448          | 73.4867  |
| 0.0041        | 25.0188 | 8000 | 0.4455          | 86.4407  |

### Framework versions

- Transformers 4.41.0.dev0
- Pytorch 2.2.0
- Datasets 2.16.1
- Tokenizers 0.19.1