
NeRUBioS_RoBERTa_base_bne_Training_Development

This model is a fine-tuned version of PlanTL-GOB-ES/roberta-base-bne; as the model name indicates, it was fine-tuned on the NeRUBioS training and development splits. It achieves the following results on the evaluation set:

  • Loss: 0.3499
  • Negref Precision: 0.5449
  • Negref Recall: 0.5380
  • Negref F1: 0.5414
  • Neg Precision: 0.9559
  • Neg Recall: 0.9694
  • Neg F1: 0.9626
  • Nsco Precision: 0.8730
  • Nsco Recall: 0.9062
  • Nsco F1: 0.8893
  • Unc Precision: 0.8315
  • Unc Recall: 0.8764
  • Unc F1: 0.8534
  • Usco Precision: 0.6608
  • Usco Recall: 0.7383
  • Usco F1: 0.6974
  • Precision: 0.8205
  • Recall: 0.8453
  • F1: 0.8327
  • Accuracy: 0.9526

Model description

Judging by the model name and the evaluation labels above, this is a token-classification model for Spanish biomedical text that tags negation cues (Neg), negation scopes (Nsco), negation referents (Negref), uncertainty cues (Unc), and uncertainty scopes (Usco). Further details have not been documented.

Intended uses & limitations

More information needed
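
Although intended uses are not documented, the checkpoint loads as a standard Transformers token-classification model. A minimal usage sketch (the example sentence and the aggregation_strategy choice are illustrative assumptions, not from the original card):

```python
from transformers import pipeline

# Load the checkpoint as a token-classification pipeline;
# aggregation_strategy="simple" merges sub-word pieces into word-level spans.
ner = pipeline(
    "token-classification",
    model="ajtamayoh/NeRUBioS_RoBERTa_base_bne_Training_Development",
    aggregation_strategy="simple",
)

# Illustrative Spanish clinical sentence:
# "The patient shows no fever or abdominal pain."
print(ner("El paciente no presenta fiebre ni dolor abdominal."))
```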

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 12
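
For reproducibility, these settings map directly onto transformers.TrainingArguments. A minimal sketch, assuming epoch-level evaluation (suggested by the per-epoch rows in the results table below); the output_dir is a placeholder, since the original training script is not published in this card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="NeRUBioS_RoBERTa_base_bne_Training_Development",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=12,
    lr_scheduler_type="linear",
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08, as listed above
    # (these are also the Trainer's AdamW defaults):
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumed from the per-epoch results table
)
```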

Training results

| Training Loss | Epoch | Step | Validation Loss | Negref Precision | Negref Recall | Negref F1 | Neg Precision | Neg Recall | Neg F1 | Nsco Precision | Nsco Recall | Nsco F1 | Unc Precision | Unc Recall | Unc F1 | Usco Precision | Usco Recall | Usco F1 | Precision | Recall | F1 | Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.1898 | 1.0 | 1729 | 0.1783 | 0.4516 | 0.5316 | 0.4884 | 0.9351 | 0.9596 | 0.9472 | 0.8079 | 0.8539 | 0.8303 | 0.8193 | 0.7529 | 0.7847 | 0.5816 | 0.6406 | 0.6097 | 0.7596 | 0.8041 | 0.7813 | 0.9452 |
| 0.1163 | 2.0 | 3458 | 0.1724 | 0.4906 | 0.5527 | 0.5198 | 0.9274 | 0.9760 | 0.9511 | 0.8252 | 0.9026 | 0.8622 | 0.8263 | 0.8263 | 0.8263 | 0.5662 | 0.6680 | 0.6129 | 0.7721 | 0.8376 | 0.8036 | 0.9485 |
| 0.0621 | 3.0 | 5187 | 0.1946 | 0.5139 | 0.5063 | 0.5101 | 0.9524 | 0.9618 | 0.9571 | 0.8542 | 0.8836 | 0.8687 | 0.8071 | 0.8726 | 0.8386 | 0.6034 | 0.6836 | 0.6410 | 0.7999 | 0.8249 | 0.8122 | 0.9480 |
| 0.0378 | 4.0 | 6916 | 0.2279 | 0.4923 | 0.5401 | 0.5151 | 0.9450 | 0.9749 | 0.9597 | 0.8568 | 0.8884 | 0.8723 | 0.8259 | 0.8610 | 0.8431 | 0.6179 | 0.6758 | 0.6455 | 0.7940 | 0.8347 | 0.8138 | 0.9490 |
| 0.0192 | 5.0 | 8645 | 0.2495 | 0.5227 | 0.5338 | 0.5282 | 0.9541 | 0.9760 | 0.9649 | 0.8256 | 0.8884 | 0.8558 | 0.8071 | 0.8726 | 0.8386 | 0.6049 | 0.6758 | 0.6384 | 0.7929 | 0.8351 | 0.8135 | 0.9508 |
| 0.0134 | 6.0 | 10374 | 0.2764 | 0.5199 | 0.5232 | 0.5216 | 0.9568 | 0.9672 | 0.9620 | 0.8687 | 0.8955 | 0.8819 | 0.8277 | 0.8533 | 0.8403 | 0.6389 | 0.7188 | 0.6765 | 0.8114 | 0.8347 | 0.8229 | 0.9514 |
| 0.0068 | 7.0 | 12103 | 0.2876 | 0.4880 | 0.5169 | 0.5020 | 0.9470 | 0.9760 | 0.9613 | 0.8593 | 0.8919 | 0.8753 | 0.8494 | 0.8494 | 0.8494 | 0.6456 | 0.7188 | 0.6802 | 0.8010 | 0.8351 | 0.8177 | 0.9508 |
| 0.0059 | 8.0 | 13832 | 0.2886 | 0.4991 | 0.5591 | 0.5274 | 0.9488 | 0.9705 | 0.9595 | 0.8601 | 0.8907 | 0.8751 | 0.8231 | 0.8803 | 0.8507 | 0.6528 | 0.7344 | 0.6912 | 0.7986 | 0.8446 | 0.8209 | 0.9516 |
| 0.0029 | 9.0 | 15561 | 0.3290 | 0.5408 | 0.4895 | 0.5138 | 0.9529 | 0.9716 | 0.9622 | 0.8653 | 0.9002 | 0.8824 | 0.8218 | 0.8726 | 0.8464 | 0.6090 | 0.7422 | 0.6690 | 0.8125 | 0.8358 | 0.8240 | 0.9505 |
| 0.0009 | 10.0 | 17290 | 0.3582 | 0.5438 | 0.5105 | 0.5267 | 0.9519 | 0.9716 | 0.9616 | 0.8757 | 0.9038 | 0.8895 | 0.8218 | 0.8726 | 0.8464 | 0.6737 | 0.7500 | 0.7098 | 0.8227 | 0.8413 | 0.8319 | 0.9506 |
| 0.0012 | 11.0 | 19019 | 0.3516 | 0.5139 | 0.5443 | 0.5287 | 0.9539 | 0.9705 | 0.9621 | 0.8834 | 0.9086 | 0.8958 | 0.8291 | 0.8803 | 0.8539 | 0.6761 | 0.7500 | 0.7111 | 0.8157 | 0.8489 | 0.8320 | 0.9526 |
| 0.0005 | 12.0 | 20748 | 0.3499 | 0.5449 | 0.5380 | 0.5414 | 0.9559 | 0.9694 | 0.9626 | 0.8730 | 0.9062 | 0.8893 | 0.8315 | 0.8764 | 0.8534 | 0.6608 | 0.7383 | 0.6974 | 0.8205 | 0.8453 | 0.8327 | 0.9526 |
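
The per-label columns (Negref, Neg, Nsco, Unc, Usco) plus the overall Precision/Recall/F1/Accuracy match the output shape of the seqeval metric commonly paired with the Trainer for token classification. A minimal sketch of such a compute_metrics function, assuming an IOB2 label inventory over the five tag families (the exact label list and its ordering are assumptions, not from the original card):

```python
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")

# Assumed IOB2 tag inventory over the five NeRUBioS label families;
# the true list and ordering come from the (undocumented) dataset.
LABEL_LIST = [
    "O",
    "B-NEG", "I-NEG", "B-NSCO", "I-NSCO", "B-NEGREF", "I-NEGREF",
    "B-UNC", "I-UNC", "B-USCO", "I-USCO",
]

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=2)
    # Drop positions labeled -100 (sub-word continuations and padding).
    true_predictions = [
        [LABEL_LIST[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [LABEL_LIST[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    # Per-family dicts (e.g. results["NEG"]["f1"]) yield the Neg/Nsco/... columns;
    # overall_precision / overall_recall / overall_f1 / overall_accuracy yield
    # the aggregate Precision, Recall, F1, and Accuracy columns.
    return results
```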

Framework versions

  • Transformers 4.38.2
  • Pytorch 2.2.1+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2
Model size: 124M params (Safetensors, F32)
