AntoineD committed on
Commit
c409a95
1 Parent(s): 61b34b5

End of training

Files changed (2)
  1. README.md +111 -0
  2. adapter_model.bin +1 -1
README.md ADDED
@@ -0,0 +1,111 @@
---
base_model: camembert/camembert-base-ccnet
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: camembert_ccnet_classification_tools_qlora_fr
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# camembert_ccnet_classification_tools_qlora_fr

This model is a fine-tuned version of [camembert/camembert-base-ccnet](https://huggingface.co/camembert/camembert-base-ccnet) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4958
- Accuracy: 0.45

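Since this repository stores only a small LoRA adapter (`adapter_model.bin`, about 7.6 MB), the base model has to be loaded first and the adapter applied on top of it. A minimal usage sketch follows, assuming the adapter was saved with the PEFT library and is published as `AntoineD/camembert_ccnet_classification_tools_qlora_fr`; the number of labels is likewise a placeholder, since the card does not state it.

```python
# Minimal sketch (assumptions noted inline): load the base CamemBERT encoder
# and apply the LoRA adapter from this repository for sequence classification.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from peft import PeftModel

base_id = "camembert/camembert-base-ccnet"
adapter_id = "AntoineD/camembert_ccnet_classification_tools_qlora_fr"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForSequenceClassification.from_pretrained(
    base_id, num_labels=8  # label count not documented in the card; placeholder
)
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

inputs = tokenizer("Serrer l'écrou avec une clé de 12", return_tensors="pt")
with torch.no_grad():
    predicted_class = model(**inputs).logits.argmax(dim=-1).item()
print(predicted_class)
```
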
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 24
- eval_batch_size: 192
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 60

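The training script itself is not included in this card. As a rough sketch of how the hyperparameters above could map onto a QLoRA-style `transformers`/`peft` setup: the 4-bit quantization and LoRA settings below are assumptions suggested only by the "qlora" name and the small adapter file, not values documented here.

```python
# Illustrative sketch, not the author's script: QLoRA-style fine-tuning with the
# hyperparameters listed above. Quantization and LoRA settings are assumed.
import torch
from transformers import (AutoModelForSequenceClassification, BitsAndBytesConfig,
                          Trainer, TrainingArguments)
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

quant_config = BitsAndBytesConfig(
    load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16  # assumed 4-bit setup
)
model = AutoModelForSequenceClassification.from_pretrained(
    "camembert/camembert-base-ccnet",
    num_labels=8,  # label count not documented; placeholder
    quantization_config=quant_config,
)
model = prepare_model_for_kbit_training(model)
model = get_peft_model(
    model,
    LoraConfig(task_type="SEQ_CLS", r=16, lora_alpha=32, lora_dropout=0.1),  # assumed LoRA settings
)

args = TrainingArguments(
    output_dir="camembert_ccnet_classification_tools_qlora_fr",
    learning_rate=1e-4,             # matches the card
    per_device_train_batch_size=24,
    per_device_eval_batch_size=192,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=60,
    evaluation_strategy="epoch",    # consistent with the per-epoch results below
)
# trainer = Trainer(model=model, args=args, train_dataset=..., eval_dataset=...)
# trainer.train()
```

Given the 5 optimizer steps per epoch in the table below and a batch size of 24, the training set appears to contain on the order of 120 examples (more if training ran on several devices).
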
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 5 | 2.0916 | 0.075 |
| No log | 2.0 | 10 | 2.1165 | 0.075 |
| No log | 3.0 | 15 | 2.1285 | 0.075 |
| No log | 4.0 | 20 | 2.1210 | 0.075 |
| No log | 5.0 | 25 | 2.1182 | 0.125 |
| No log | 6.0 | 30 | 2.0984 | 0.125 |
| No log | 7.0 | 35 | 2.0764 | 0.15 |
| No log | 8.0 | 40 | 2.0390 | 0.2 |
| No log | 9.0 | 45 | 2.0085 | 0.2 |
| No log | 10.0 | 50 | 1.9749 | 0.175 |
| No log | 11.0 | 55 | 1.9474 | 0.15 |
| No log | 12.0 | 60 | 1.9271 | 0.225 |
| No log | 13.0 | 65 | 1.9089 | 0.225 |
| No log | 14.0 | 70 | 1.8945 | 0.225 |
| No log | 15.0 | 75 | 1.8847 | 0.2 |
| No log | 16.0 | 80 | 1.8638 | 0.25 |
| No log | 17.0 | 85 | 1.8387 | 0.3 |
| No log | 18.0 | 90 | 1.8156 | 0.275 |
| No log | 19.0 | 95 | 1.8003 | 0.3 |
| No log | 20.0 | 100 | 1.7827 | 0.275 |
| No log | 21.0 | 105 | 1.7688 | 0.3 |
| No log | 22.0 | 110 | 1.7467 | 0.275 |
| No log | 23.0 | 115 | 1.7255 | 0.275 |
| No log | 24.0 | 120 | 1.7132 | 0.325 |
| No log | 25.0 | 125 | 1.7007 | 0.35 |
| No log | 26.0 | 130 | 1.6881 | 0.35 |
| No log | 27.0 | 135 | 1.6801 | 0.35 |
| No log | 28.0 | 140 | 1.6642 | 0.375 |
| No log | 29.0 | 145 | 1.6450 | 0.325 |
| No log | 30.0 | 150 | 1.6425 | 0.35 |
| No log | 31.0 | 155 | 1.6305 | 0.375 |
| No log | 32.0 | 160 | 1.6193 | 0.4 |
| No log | 33.0 | 165 | 1.6128 | 0.4 |
| No log | 34.0 | 170 | 1.6027 | 0.4 |
| No log | 35.0 | 175 | 1.5915 | 0.425 |
| No log | 36.0 | 180 | 1.5837 | 0.45 |
| No log | 37.0 | 185 | 1.5721 | 0.45 |
| No log | 38.0 | 190 | 1.5605 | 0.425 |
| No log | 39.0 | 195 | 1.5555 | 0.425 |
| No log | 40.0 | 200 | 1.5521 | 0.425 |
| No log | 41.0 | 205 | 1.5480 | 0.425 |
| No log | 42.0 | 210 | 1.5399 | 0.45 |
| No log | 43.0 | 215 | 1.5276 | 0.45 |
| No log | 44.0 | 220 | 1.5282 | 0.45 |
| No log | 45.0 | 225 | 1.5197 | 0.45 |
| No log | 46.0 | 230 | 1.5175 | 0.45 |
| No log | 47.0 | 235 | 1.5065 | 0.45 |
| No log | 48.0 | 240 | 1.5043 | 0.45 |
| No log | 49.0 | 245 | 1.5019 | 0.45 |
| No log | 50.0 | 250 | 1.4975 | 0.45 |
| No log | 51.0 | 255 | 1.4949 | 0.45 |
| No log | 52.0 | 260 | 1.4969 | 0.45 |
| No log | 53.0 | 265 | 1.4958 | 0.45 |


### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.14.1
adapter_model.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:fa80bd694d663d4ce359b71ef1293e95d7c53cc972a197658b9bdd68bd73d28c
+oid sha256:d7a3528255f6694734ef130dedaf9f8efcb9bc29d22cefac01f8d47a85a738eb
 size 7571689