# d20536cf8a4db9bb7935c120ee347d92
This model is a fine-tuned version of google-t5/t5-large on the Helsinki-NLP/opus_books [it-pt] dataset. It achieves the following results on the evaluation set:
- Loss: 1.7724
- Data Size: 1.0 (fraction of the training data used)
- Epoch Runtime: 21.1339 s
- BLEU: 5.7456
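
A minimal inference sketch for this checkpoint follows. The repo id is taken from this card, but the task prefix is an assumption: the preprocessing used for fine-tuning is not documented here, and T5 translation prefixes vary between setups.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "contemmcm/d20536cf8a4db9bb7935c120ee347d92"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# The "translate Italian to Portuguese: " prefix is an assumption; the
# prefix (if any) used during fine-tuning is not documented in this card.
text = "translate Italian to Portuguese: Il gatto dorme sul divano."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```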
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
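
The dataset itself is named at the top of this card; as a hedged sketch, it can be loaded with 🤗 Datasets as below. The train/eval division and any subsampling used for this run (the "Data Size" schedule in the results table suggests progressive subsampling) are not documented.

```python
from datasets import load_dataset

# opus_books ships only a "train" split; how it was divided into training
# and evaluation data for this run is not documented in the card.
books = load_dataset("Helsinki-NLP/opus_books", "it-pt")
print(books["train"][0]["translation"])  # {'it': '...', 'pt': '...'}
```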
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a hedged configuration sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: constant
- num_epochs: 50
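
As a rough reconstruction only, the settings above map onto `Seq2SeqTrainingArguments` as sketched below; `output_dir` and `predict_with_generate` are assumptions, not values recorded in this card.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-large-opus-books-it-pt",  # assumed name, not from the card
    learning_rate=5e-5,
    per_device_train_batch_size=8,   # 8 per device x 4 GPUs = 32 total
    per_device_eval_batch_size=8,    # 8 per device x 4 GPUs = 32 total
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="constant",
    num_train_epochs=50,
    predict_with_generate=True,      # assumed; needed to compute BLEU during eval
)
```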
### Training results
| Training Loss | Epoch | Step | Validation Loss | Data Size | Epoch Runtime (s) | BLEU |
|---|---|---|---|---|---|---|
| No log | 0 | 0 | 3.8673 | 0 | 2.1007 | 0.7957 |
| No log | 1 | 29 | 3.7587 | 0.0078 | 2.3656 | 0.8667 |
| No log | 2 | 58 | 3.3359 | 0.0156 | 3.9018 | 1.2813 |
| No log | 3 | 87 | 3.1482 | 0.0312 | 5.7201 | 1.4520 |
| No log | 4 | 116 | 2.9528 | 0.0625 | 6.9152 | 1.6262 |
| No log | 5 | 145 | 2.5550 | 0.125 | 9.0873 | 3.0395 |
| 0.2895 | 6 | 174 | 2.2419 | 0.25 | 12.9129 | 3.6679 |
| 0.2895 | 7 | 203 | 2.1014 | 0.5 | 16.0726 | 3.4311 |
| 0.2895 | 8 | 232 | 1.9688 | 1.0 | 22.7370 | 3.8893 |
| 1.4182 | 9 | 261 | 1.8872 | 1.0 | 21.3748 | 4.9628 |
| 1.4182 | 10 | 290 | 1.8393 | 1.0 | 20.9844 | 4.8081 |
| 1.8728 | 11 | 319 | 1.8147 | 1.0 | 20.7618 | 5.0388 |
| 1.8728 | 12 | 348 | 1.7891 | 1.0 | 20.4942 | 5.2640 |
| 1.7288 | 13 | 377 | 1.7681 | 1.0 | 20.2467 | 5.3212 |
| 1.5867 | 14 | 406 | 1.7630 | 1.0 | 21.3503 | 5.4210 |
| 1.5867 | 15 | 435 | 1.7548 | 1.0 | 20.7495 | 5.5020 |
| 1.4762 | 16 | 464 | 1.7474 | 1.0 | 19.7304 | 5.5773 |
| 1.4762 | 17 | 493 | 1.7562 | 1.0 | 21.5750 | 5.6653 |
| 1.3933 | 18 | 522 | 1.7485 | 1.0 | 20.7114 | 5.7272 |
| 1.3097 | 19 | 551 | 1.7612 | 1.0 | 20.4159 | 5.6989 |
| 1.3097 | 20 | 580 | 1.7724 | 1.0 | 21.1339 | 5.7456 |
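
The BLEU column is an evaluation-set score computed during training. A minimal sketch of computing such a score, assuming the sacrebleu implementation via the `evaluate` library (the exact metric configuration used for this run is not documented):

```python
import evaluate  # requires the `evaluate` and `sacrebleu` packages

bleu = evaluate.load("sacrebleu")
predictions = ["Eu gosto de ler livros."]   # hypothetical model outputs
references = [["Eu gosto de ler."]]         # one list of references per prediction
result = bleu.compute(predictions=predictions, references=references)
print(round(result["score"], 4))            # corpus-level BLEU on a 0-100 scale
```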
### Framework versions

- Transformers 4.57.0
- PyTorch 2.8.0+cu128
- Datasets 4.2.0
- Tokenizers 0.22.1