segformer-b1-GFB-NF-512-1024

This model is a fine-tuned version of nvidia/mit-b1 on the segments/GFB dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6972
  • Mean Iou: 0.5572
  • Mean Accuracy: 0.6875
  • Overall Accuracy: 0.8653
  • Accuracy Unlabeled: 0.9296
  • Accuracy Gbm: 0.7701
  • Accuracy Podo: 0.6031
  • Accuracy Endo: 0.4472
  • Iou Unlabeled: 0.8707
  • Iou Gbm: 0.5719
  • Iou Podo: 0.4445
  • Iou Endo: 0.3417
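
The following is a minimal inference sketch, not code from the original authors. It assumes the repository ships the usual SegFormer image processor config, and `glomerulus.png` is a placeholder path for your own input image:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Checkpoint id of this repository.
checkpoint = "luoyun75579/segformer-b1-GFB-NF-512-1024"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

# "glomerulus.png" is a placeholder, not a file shipped with the model.
image = Image.open("glomerulus.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 of the input resolution; upsample before argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) map of class indices
```

The mapping from class indices to the class names used in the metrics above (unlabeled, GBM, podo, endo) should be available via `model.config.id2label`.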

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 250
  • num_epochs: 100
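
As a hedged illustration (the original training script is not part of this card), these values map onto `transformers.TrainingArguments` roughly as follows; `output_dir` is an assumed placeholder:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b1-GFB-NF-512-1024",  # placeholder, not from the original script
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch_fused",   # AdamW with betas=(0.9, 0.999), epsilon=1e-08
    lr_scheduler_type="cosine",
    warmup_steps=250,
    num_train_epochs=100,
)
```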

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Gbm | Accuracy Podo | Accuracy Endo | Iou Unlabeled | Iou Gbm | Iou Podo | Iou Endo |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.9456 | 1.0526 | 100 | 1.0895 | 0.3047 | 0.4705 | 0.7199 | 0.7850 | 0.9252 | 0.1442 | 0.0277 | 0.7493 | 0.3392 | 0.1054 | 0.0249 |
| 0.5926 | 2.1053 | 200 | 0.6151 | 0.3885 | 0.4834 | 0.8305 | 0.9514 | 0.7433 | 0.2228 | 0.0161 | 0.8441 | 0.5056 | 0.1886 | 0.0157 |
| 0.5148 | 3.1579 | 300 | 0.4627 | 0.4671 | 0.5747 | 0.8473 | 0.9395 | 0.7066 | 0.5266 | 0.1261 | 0.8597 | 0.5320 | 0.3614 | 0.1153 |
| 0.2959 | 4.2105 | 400 | 0.4571 | 0.4587 | 0.5350 | 0.8429 | 0.9715 | 0.5666 | 0.3330 | 0.2688 | 0.8439 | 0.4839 | 0.2852 | 0.2218 |
| 0.3072 | 5.2632 | 500 | 0.4246 | 0.5276 | 0.6815 | 0.8472 | 0.9031 | 0.8183 | 0.5609 | 0.4435 | 0.8563 | 0.5431 | 0.3966 | 0.3143 |
| 0.2677 | 6.3158 | 600 | 0.3923 | 0.5335 | 0.6337 | 0.8652 | 0.9540 | 0.6837 | 0.5495 | 0.3475 | 0.8705 | 0.5577 | 0.4148 | 0.2911 |
| 0.2468 | 7.3684 | 700 | 0.4119 | 0.5445 | 0.6949 | 0.8527 | 0.9082 | 0.7640 | 0.6493 | 0.4583 | 0.8589 | 0.5516 | 0.4257 | 0.3420 |
| 0.2456 | 8.4211 | 800 | 0.4212 | 0.5478 | 0.6944 | 0.8575 | 0.9136 | 0.7821 | 0.6369 | 0.4451 | 0.8649 | 0.5666 | 0.4297 | 0.3298 |
| 0.2206 | 9.4737 | 900 | 0.4098 | 0.5433 | 0.6663 | 0.8621 | 0.9381 | 0.7338 | 0.5410 | 0.4524 | 0.8688 | 0.5613 | 0.4090 | 0.3341 |
| 0.183 | 10.5263 | 1000 | 0.4265 | 0.5421 | 0.6582 | 0.8646 | 0.9414 | 0.7649 | 0.5134 | 0.4130 | 0.8719 | 0.5699 | 0.3997 | 0.3271 |
| 0.1808 | 11.5789 | 1100 | 0.4287 | 0.5491 | 0.6767 | 0.8635 | 0.9304 | 0.7711 | 0.5861 | 0.4192 | 0.8697 | 0.5709 | 0.4311 | 0.3246 |
| 0.1513 | 12.6316 | 1200 | 0.4276 | 0.5436 | 0.6613 | 0.8631 | 0.9381 | 0.7362 | 0.5696 | 0.4015 | 0.8686 | 0.5608 | 0.4240 | 0.3210 |
| 0.1532 | 13.6842 | 1300 | 0.4409 | 0.5557 | 0.6879 | 0.8632 | 0.9293 | 0.7460 | 0.6066 | 0.4698 | 0.8685 | 0.5640 | 0.4393 | 0.3510 |
| 0.1929 | 14.7368 | 1400 | 0.4531 | 0.5416 | 0.6609 | 0.8613 | 0.9425 | 0.6989 | 0.5376 | 0.4648 | 0.8667 | 0.5474 | 0.4129 | 0.3396 |
| 0.1722 | 15.7895 | 1500 | 0.4607 | 0.5497 | 0.6938 | 0.8581 | 0.9177 | 0.7781 | 0.6006 | 0.4786 | 0.8634 | 0.5615 | 0.4317 | 0.3424 |
| 0.0925 | 16.8421 | 1600 | 0.4733 | 0.5450 | 0.6895 | 0.8571 | 0.9143 | 0.8035 | 0.5993 | 0.4409 | 0.8643 | 0.5607 | 0.4282 | 0.3267 |
| 0.1685 | 17.8947 | 1700 | 0.4764 | 0.5487 | 0.6792 | 0.8600 | 0.9319 | 0.7100 | 0.5906 | 0.4843 | 0.8639 | 0.5579 | 0.4245 | 0.3483 |
| 0.1369 | 18.9474 | 1800 | 0.4843 | 0.5507 | 0.6834 | 0.8620 | 0.9303 | 0.7545 | 0.5686 | 0.4803 | 0.8679 | 0.5658 | 0.4236 | 0.3455 |
| 0.1281 | 20.0 | 1900 | 0.4728 | 0.5561 | 0.7008 | 0.8607 | 0.9189 | 0.7776 | 0.6160 | 0.4905 | 0.8657 | 0.5721 | 0.4380 | 0.3487 |
| 0.1207 | 21.0526 | 2000 | 0.4867 | 0.5523 | 0.6780 | 0.8645 | 0.9329 | 0.7500 | 0.6021 | 0.4269 | 0.8699 | 0.5676 | 0.4401 | 0.3316 |
| 0.1395 | 22.1053 | 2100 | 0.4914 | 0.5530 | 0.6786 | 0.8655 | 0.9322 | 0.7766 | 0.5858 | 0.4198 | 0.8721 | 0.5723 | 0.4354 | 0.3322 |
| 0.137 | 23.1579 | 2200 | 0.5332 | 0.5431 | 0.6948 | 0.8533 | 0.9042 | 0.8170 | 0.6288 | 0.4291 | 0.8589 | 0.5581 | 0.4324 | 0.3229 |
| 0.1078 | 24.2105 | 2300 | 0.5052 | 0.5534 | 0.6923 | 0.8618 | 0.9264 | 0.7551 | 0.5925 | 0.4952 | 0.8667 | 0.5673 | 0.4393 | 0.3404 |
| 0.1127 | 25.2632 | 2400 | 0.5219 | 0.5508 | 0.6767 | 0.8632 | 0.9323 | 0.7490 | 0.5894 | 0.4363 | 0.8670 | 0.5696 | 0.4349 | 0.3319 |
| 0.1172 | 26.3158 | 2500 | 0.5395 | 0.5473 | 0.6888 | 0.8580 | 0.9230 | 0.7481 | 0.5894 | 0.4945 | 0.8633 | 0.5542 | 0.4329 | 0.3387 |
| 0.0976 | 27.3684 | 2600 | 0.5311 | 0.5497 | 0.6793 | 0.8625 | 0.9306 | 0.7600 | 0.5756 | 0.4510 | 0.8683 | 0.5668 | 0.4268 | 0.3371 |
| 0.1128 | 28.4211 | 2700 | 0.5346 | 0.5423 | 0.6641 | 0.8613 | 0.9365 | 0.7460 | 0.5340 | 0.4401 | 0.8665 | 0.5617 | 0.4082 | 0.3329 |
| 0.0711 | 29.4737 | 2800 | 0.5315 | 0.5531 | 0.6854 | 0.8625 | 0.9283 | 0.7466 | 0.6137 | 0.4528 | 0.8678 | 0.5624 | 0.4421 | 0.3401 |
| 0.0795 | 30.5263 | 2900 | 0.5507 | 0.5564 | 0.7016 | 0.8608 | 0.9162 | 0.7929 | 0.6300 | 0.4672 | 0.8655 | 0.5718 | 0.4485 | 0.3397 |
| 0.1243 | 31.5789 | 3000 | 0.5455 | 0.5575 | 0.6928 | 0.8646 | 0.9284 | 0.7676 | 0.5984 | 0.4769 | 0.8707 | 0.5770 | 0.4363 | 0.3460 |
| 0.1078 | 32.6316 | 3100 | 0.5397 | 0.5566 | 0.6921 | 0.8632 | 0.9289 | 0.7477 | 0.5979 | 0.4938 | 0.8681 | 0.5693 | 0.4373 | 0.3518 |
| 0.0867 | 33.6842 | 3200 | 0.5545 | 0.5584 | 0.6949 | 0.8643 | 0.9251 | 0.7761 | 0.6175 | 0.4611 | 0.8698 | 0.5731 | 0.4478 | 0.3431 |
| 0.1081 | 34.7368 | 3300 | 0.5504 | 0.5561 | 0.6899 | 0.8637 | 0.9266 | 0.7720 | 0.6050 | 0.4559 | 0.8693 | 0.5724 | 0.4379 | 0.3446 |
| 0.0761 | 35.7895 | 3400 | 0.5615 | 0.5504 | 0.6774 | 0.8633 | 0.9310 | 0.7592 | 0.5920 | 0.4273 | 0.8687 | 0.5666 | 0.4357 | 0.3307 |
| 0.0858 | 36.8421 | 3500 | 0.5828 | 0.5496 | 0.6765 | 0.8629 | 0.9311 | 0.7647 | 0.5763 | 0.4338 | 0.8685 | 0.5638 | 0.4307 | 0.3353 |
| 0.0761 | 37.8947 | 3600 | 0.5640 | 0.5507 | 0.6736 | 0.8638 | 0.9353 | 0.7406 | 0.5807 | 0.4379 | 0.8677 | 0.5681 | 0.4318 | 0.3351 |
| 0.0808 | 38.9474 | 3700 | 0.5691 | 0.5508 | 0.6771 | 0.8643 | 0.9312 | 0.7680 | 0.5937 | 0.4154 | 0.8695 | 0.5713 | 0.4391 | 0.3235 |
| 0.149 | 40.0 | 3800 | 0.5780 | 0.5484 | 0.6748 | 0.8623 | 0.9333 | 0.7538 | 0.5556 | 0.4565 | 0.8678 | 0.5639 | 0.4185 | 0.3434 |
| 0.0779 | 41.0526 | 3900 | 0.5739 | 0.5545 | 0.6942 | 0.8619 | 0.9210 | 0.7903 | 0.6081 | 0.4576 | 0.8672 | 0.5704 | 0.4442 | 0.3361 |
| 0.0839 | 42.1053 | 4000 | 0.5863 | 0.5541 | 0.6884 | 0.8632 | 0.9255 | 0.7768 | 0.6065 | 0.4447 | 0.8686 | 0.5696 | 0.4436 | 0.3346 |
| 0.1068 | 43.1579 | 4100 | 0.5705 | 0.5573 | 0.7025 | 0.8615 | 0.9186 | 0.7766 | 0.6337 | 0.4812 | 0.8672 | 0.5693 | 0.4477 | 0.3450 |
| 0.0843 | 44.2105 | 4200 | 0.5941 | 0.5583 | 0.6935 | 0.8647 | 0.9262 | 0.7757 | 0.6139 | 0.4580 | 0.8697 | 0.5747 | 0.4494 | 0.3394 |
| 0.0657 | 45.2632 | 4300 | 0.5937 | 0.5565 | 0.6929 | 0.8627 | 0.9239 | 0.7592 | 0.6378 | 0.4509 | 0.8675 | 0.5679 | 0.4506 | 0.3400 |
| 0.0651 | 46.3158 | 4400 | 0.6141 | 0.5527 | 0.6814 | 0.8638 | 0.9297 | 0.7756 | 0.5812 | 0.4390 | 0.8690 | 0.5693 | 0.4356 | 0.3369 |
| 0.0922 | 47.3684 | 4500 | 0.6089 | 0.5558 | 0.6848 | 0.8645 | 0.9311 | 0.7540 | 0.6024 | 0.4519 | 0.8693 | 0.5713 | 0.4401 | 0.3425 |
| 0.0528 | 48.4211 | 4600 | 0.6147 | 0.5529 | 0.6852 | 0.8628 | 0.9249 | 0.7833 | 0.6074 | 0.4252 | 0.8677 | 0.5679 | 0.4453 | 0.3307 |
| 0.0739 | 49.4737 | 4700 | 0.6108 | 0.5592 | 0.6998 | 0.8626 | 0.9216 | 0.7692 | 0.6309 | 0.4774 | 0.8674 | 0.5705 | 0.4471 | 0.3519 |
| 0.1017 | 50.5263 | 4800 | 0.6339 | 0.5526 | 0.6784 | 0.8644 | 0.9327 | 0.7681 | 0.5707 | 0.4421 | 0.8691 | 0.5689 | 0.4341 | 0.3382 |
| 0.0791 | 51.5789 | 4900 | 0.6122 | 0.5523 | 0.6827 | 0.8627 | 0.9291 | 0.7638 | 0.5851 | 0.4530 | 0.8673 | 0.5657 | 0.4370 | 0.3390 |
| 0.0643 | 52.6316 | 5000 | 0.6261 | 0.5530 | 0.6855 | 0.8630 | 0.9265 | 0.7765 | 0.5970 | 0.4420 | 0.8681 | 0.5669 | 0.4420 | 0.3351 |
| 0.0701 | 53.6842 | 5100 | 0.6176 | 0.5572 | 0.6937 | 0.8634 | 0.9248 | 0.7692 | 0.6202 | 0.4605 | 0.8685 | 0.5712 | 0.4467 | 0.3422 |
| 0.0665 | 54.7368 | 5200 | 0.6148 | 0.5555 | 0.6853 | 0.8649 | 0.9295 | 0.7793 | 0.5881 | 0.4443 | 0.8704 | 0.5720 | 0.4397 | 0.3399 |
| 0.0954 | 55.7895 | 5300 | 0.6239 | 0.5540 | 0.6801 | 0.8653 | 0.9331 | 0.7636 | 0.5856 | 0.4382 | 0.8704 | 0.5719 | 0.4396 | 0.3339 |
| 0.0642 | 56.8421 | 5400 | 0.6293 | 0.5567 | 0.6893 | 0.8639 | 0.9271 | 0.7679 | 0.6107 | 0.4517 | 0.8688 | 0.5681 | 0.4462 | 0.3435 |
| 0.069 | 57.8947 | 5500 | 0.6354 | 0.5569 | 0.6906 | 0.8638 | 0.9264 | 0.7713 | 0.6087 | 0.4560 | 0.8691 | 0.5694 | 0.4434 | 0.3456 |
| 0.068 | 58.9474 | 5600 | 0.6421 | 0.5561 | 0.6872 | 0.8646 | 0.9289 | 0.7702 | 0.6022 | 0.4477 | 0.8699 | 0.5705 | 0.4455 | 0.3384 |
| 0.1144 | 60.0 | 5700 | 0.6376 | 0.5549 | 0.6849 | 0.8645 | 0.9288 | 0.7700 | 0.6082 | 0.4326 | 0.8698 | 0.5710 | 0.4456 | 0.3333 |
| 0.0746 | 61.0526 | 5800 | 0.6433 | 0.5531 | 0.6755 | 0.8658 | 0.9351 | 0.7633 | 0.5809 | 0.4229 | 0.8709 | 0.5707 | 0.4390 | 0.3317 |
| 0.0702 | 62.1053 | 5900 | 0.6516 | 0.5549 | 0.6819 | 0.8654 | 0.9330 | 0.7613 | 0.5878 | 0.4454 | 0.8709 | 0.5705 | 0.4393 | 0.3388 |
| 0.0739 | 63.1579 | 6000 | 0.6511 | 0.5549 | 0.6830 | 0.8647 | 0.9322 | 0.7570 | 0.5895 | 0.4534 | 0.8699 | 0.5672 | 0.4404 | 0.3419 |
| 0.0733 | 64.2105 | 6100 | 0.6616 | 0.5571 | 0.6894 | 0.8651 | 0.9288 | 0.7727 | 0.6018 | 0.4542 | 0.8708 | 0.5712 | 0.4454 | 0.3411 |
| 0.0943 | 65.2632 | 6200 | 0.6595 | 0.5572 | 0.6877 | 0.8653 | 0.9292 | 0.7760 | 0.5998 | 0.4459 | 0.8709 | 0.5732 | 0.4434 | 0.3415 |
| 0.0826 | 66.3158 | 6300 | 0.6623 | 0.5553 | 0.6850 | 0.8646 | 0.9299 | 0.7634 | 0.6043 | 0.4426 | 0.8699 | 0.5700 | 0.4432 | 0.3380 |
| 0.0584 | 67.3684 | 6400 | 0.6611 | 0.5535 | 0.6858 | 0.8634 | 0.9277 | 0.7715 | 0.5960 | 0.4481 | 0.8688 | 0.5680 | 0.4415 | 0.3355 |
| 0.0613 | 68.4211 | 6500 | 0.6670 | 0.5573 | 0.6960 | 0.8631 | 0.9220 | 0.7861 | 0.6184 | 0.4574 | 0.8685 | 0.5709 | 0.4474 | 0.3423 |
| 0.0885 | 69.4737 | 6600 | 0.6687 | 0.5564 | 0.6853 | 0.8652 | 0.9308 | 0.7639 | 0.6018 | 0.4449 | 0.8703 | 0.5720 | 0.4433 | 0.3400 |
| 0.0696 | 70.5263 | 6700 | 0.6711 | 0.5574 | 0.6920 | 0.8642 | 0.9268 | 0.7741 | 0.6035 | 0.4637 | 0.8699 | 0.5710 | 0.4432 | 0.3456 |
| 0.057 | 71.5789 | 6800 | 0.6672 | 0.5566 | 0.6884 | 0.8646 | 0.9284 | 0.7711 | 0.6034 | 0.4507 | 0.8698 | 0.5717 | 0.4433 | 0.3417 |
| 0.0516 | 72.6316 | 6900 | 0.6752 | 0.5557 | 0.6851 | 0.8651 | 0.9300 | 0.7669 | 0.6061 | 0.4374 | 0.8704 | 0.5708 | 0.4463 | 0.3354 |
| 0.0537 | 73.6842 | 7000 | 0.6708 | 0.5564 | 0.6846 | 0.8656 | 0.9319 | 0.7630 | 0.5967 | 0.4469 | 0.8709 | 0.5722 | 0.4425 | 0.3402 |
| 0.0741 | 74.7368 | 7100 | 0.6764 | 0.5575 | 0.6885 | 0.8652 | 0.9292 | 0.7663 | 0.6128 | 0.4456 | 0.8706 | 0.5728 | 0.4469 | 0.3398 |
| 0.0842 | 75.7895 | 7200 | 0.6727 | 0.5565 | 0.6897 | 0.8643 | 0.9266 | 0.7811 | 0.6025 | 0.4487 | 0.8697 | 0.5727 | 0.4431 | 0.3403 |
| 0.0849 | 76.8421 | 7300 | 0.6830 | 0.5574 | 0.6874 | 0.8656 | 0.9298 | 0.7738 | 0.6000 | 0.4459 | 0.8711 | 0.5726 | 0.4447 | 0.3414 |
| 0.0612 | 77.8947 | 7400 | 0.6802 | 0.5579 | 0.6856 | 0.8664 | 0.9327 | 0.7617 | 0.6011 | 0.4471 | 0.8718 | 0.5740 | 0.4446 | 0.3411 |
| 0.0738 | 78.9474 | 7500 | 0.6790 | 0.5590 | 0.6894 | 0.8660 | 0.9296 | 0.7704 | 0.6105 | 0.4472 | 0.8712 | 0.5736 | 0.4490 | 0.3419 |
| 0.0575 | 80.0 | 7600 | 0.6868 | 0.5583 | 0.6896 | 0.8655 | 0.9287 | 0.7763 | 0.6038 | 0.4494 | 0.8709 | 0.5730 | 0.4460 | 0.3432 |
| 0.0665 | 81.0526 | 7700 | 0.6792 | 0.5579 | 0.6904 | 0.8650 | 0.9278 | 0.7752 | 0.6071 | 0.4515 | 0.8704 | 0.5724 | 0.4463 | 0.3424 |
| 0.0725 | 82.1053 | 7800 | 0.6867 | 0.5580 | 0.6904 | 0.8650 | 0.9278 | 0.7690 | 0.6165 | 0.4484 | 0.8704 | 0.5718 | 0.4480 | 0.3418 |
| 0.074 | 83.1579 | 7900 | 0.6843 | 0.5573 | 0.6914 | 0.8642 | 0.9263 | 0.7739 | 0.6131 | 0.4524 | 0.8698 | 0.5710 | 0.4459 | 0.3424 |
| 0.0626 | 84.2105 | 8000 | 0.6923 | 0.5575 | 0.6894 | 0.8650 | 0.9284 | 0.7709 | 0.6069 | 0.4515 | 0.8705 | 0.5718 | 0.4447 | 0.3430 |
| 0.0667 | 85.2632 | 8100 | 0.6884 | 0.5572 | 0.6871 | 0.8653 | 0.9300 | 0.7723 | 0.5953 | 0.4507 | 0.8707 | 0.5720 | 0.4420 | 0.3441 |
| 0.048 | 86.3158 | 8200 | 0.6932 | 0.5572 | 0.6853 | 0.8658 | 0.9317 | 0.7651 | 0.5976 | 0.4466 | 0.8711 | 0.5722 | 0.4429 | 0.3426 |
| 0.0649 | 87.3684 | 8300 | 0.6918 | 0.5564 | 0.6839 | 0.8656 | 0.9318 | 0.7650 | 0.5979 | 0.4409 | 0.8708 | 0.5718 | 0.4432 | 0.3399 |
| 0.0602 | 88.4211 | 8400 | 0.6869 | 0.5582 | 0.6903 | 0.8651 | 0.9282 | 0.7721 | 0.6088 | 0.4522 | 0.8705 | 0.5723 | 0.4463 | 0.3439 |
| 0.0678 | 89.4737 | 8500 | 0.6896 | 0.5570 | 0.6863 | 0.8654 | 0.9305 | 0.7677 | 0.6021 | 0.4451 | 0.8708 | 0.5723 | 0.4442 | 0.3407 |
| 0.0661 | 90.5263 | 8600 | 0.6943 | 0.5575 | 0.6881 | 0.8653 | 0.9298 | 0.7687 | 0.6019 | 0.4522 | 0.8707 | 0.5725 | 0.4439 | 0.3430 |
| 0.0712 | 91.5789 | 8700 | 0.6962 | 0.5572 | 0.6882 | 0.8650 | 0.9289 | 0.7705 | 0.6067 | 0.4467 | 0.8704 | 0.5719 | 0.4450 | 0.3415 |
| 0.061 | 92.6316 | 8800 | 0.6919 | 0.5567 | 0.6873 | 0.8649 | 0.9291 | 0.7699 | 0.6050 | 0.4453 | 0.8703 | 0.5716 | 0.4443 | 0.3407 |
| 0.0665 | 93.6842 | 8900 | 0.6951 | 0.5562 | 0.6843 | 0.8655 | 0.9310 | 0.7697 | 0.5963 | 0.4403 | 0.8708 | 0.5719 | 0.4426 | 0.3397 |
| 0.0725 | 94.7368 | 9000 | 0.6954 | 0.5568 | 0.6866 | 0.8652 | 0.9298 | 0.7717 | 0.5997 | 0.4452 | 0.8707 | 0.5719 | 0.4435 | 0.3412 |
| 0.0612 | 95.7895 | 9100 | 0.6919 | 0.5571 | 0.6864 | 0.8654 | 0.9306 | 0.7672 | 0.6003 | 0.4477 | 0.8708 | 0.5720 | 0.4438 | 0.3418 |
| 0.0679 | 96.8421 | 9200 | 0.6985 | 0.5567 | 0.6854 | 0.8655 | 0.9309 | 0.7685 | 0.5982 | 0.4442 | 0.8709 | 0.5720 | 0.4432 | 0.3407 |
| 0.0712 | 97.8947 | 9300 | 0.6977 | 0.5568 | 0.6869 | 0.8652 | 0.9298 | 0.7705 | 0.6005 | 0.4470 | 0.8706 | 0.5716 | 0.4436 | 0.3414 |
| 0.0669 | 98.9474 | 9400 | 0.6917 | 0.5572 | 0.6871 | 0.8653 | 0.9300 | 0.7698 | 0.6007 | 0.4479 | 0.8708 | 0.5718 | 0.4439 | 0.3422 |
| 0.0714 | 100.0 | 9500 | 0.6972 | 0.5572 | 0.6875 | 0.8653 | 0.9296 | 0.7701 | 0.6031 | 0.4472 | 0.8707 | 0.5719 | 0.4445 | 0.3417 |
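
As a quick sanity check on the metric definitions, the Mean Iou and Mean Accuracy in the final row are the unweighted means of the four per-class values:

```python
# Per-class metrics from the final row (epoch 100.0, step 9500).
iou = [0.8707, 0.5719, 0.4445, 0.3417]  # unlabeled, GBM, podo, endo
acc = [0.9296, 0.7701, 0.6031, 0.4472]

print(f"{sum(iou) / len(iou):.4f}")  # 0.5572 -> matches the reported Mean Iou
print(f"{sum(acc) / len(acc):.4f}")  # 0.6875 -> matches the reported Mean Accuracy
```

Overall Accuracy (0.8653) is higher than Mean Accuracy because it is pixel-weighted and the unlabeled class dominates the images.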

Framework versions

  • Transformers 4.57.1
  • Pytorch 2.9.1+cu130
  • Datasets 4.4.1
  • Tokenizers 0.22.1
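
A minimal sketch for checking that a local environment matches these versions (all four packages expose `__version__`):

```python
import datasets
import tokenizers
import torch
import transformers

# Compare the printed versions against the list above.
for name, module in [
    ("Transformers", transformers),
    ("Pytorch", torch),
    ("Datasets", datasets),
    ("Tokenizers", tokenizers),
]:
    print(name, module.__version__)
```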