altomek committed
Commit ccffc78
1 Parent(s): 2dc7aa5

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -86,7 +86,7 @@ Please remember that all CodeRosa-70B-AB1 models operate under the llama2 license
  - [4bpw](https://huggingface.co/altomek/CodeRosa-70B-AB1-4bpw-EXL2)
  - [3.92bpw](https://huggingface.co/altomek/CodeRosa-70B-AB1-3.92bpw-EXL2) --> 40GB VRAM
  - [3.5bpw](https://huggingface.co/altomek/CodeRosa-70B-AB1-3.5bpw-EXL2)
- - [3bpw](https://huggingface.co/altomek/CodeRosa-70B-AB1-3bpw-EXL2)
+ - [3bpw](https://huggingface.co/altomek/CodeRosa-70B-AB1-3bpw-EXL2) --> this quant and the ones below it do not represent the model's full potential!
  - [2.4bpw](https://huggingface.co/altomek/CodeRosa-70B-AB1-2.4bpw-EXL2) --> 24GB VRAM
  - [measurements](https://huggingface.co/altomek/measurements/resolve/main/CodeRosa-AB1_measurement.json) --> ExLlamav2 measurements
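
For reference, a minimal sketch of how one of the EXL2 quant repositories linked above could be pulled locally with `huggingface_hub` before loading it in an ExLlamaV2-based frontend. The repo id is copied from the 2.4bpw entry in the list; the local directory name is an arbitrary choice for this example.

```python
# Minimal sketch: download one of the EXL2 quants listed above with huggingface_hub.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="altomek/CodeRosa-70B-AB1-2.4bpw-EXL2",  # any of the bpw variants above works
    local_dir="CodeRosa-70B-AB1-2.4bpw-EXL2",        # example target directory
)
print(f"Model downloaded to {local_dir}")
```

The measurements JSON linked last is ExLlamaV2 calibration data used when producing these quants; it is not required just to run one of the already-quantized repositories.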