Muennighoff committed
Commit d8095b8
1 Parent(s): d7c44e1

Create README.md

Files changed (1): README.md ADDED (+29 -0)
---
license: apache-2.0
language:
- en
tags:
- moe
- olmo
- olmoe
co2_eq_emissions: 1
datasets:
- allenai/ultrafeedback_binarized_cleaned
base_model: allenai/OLMoE-1B-7B-0924-SFT
library_name: transformers
---

GGUF version of https://huggingface.co/allenai/OLMoE-1B-7B-0924-Instruct

```bibtex
@misc{muennighoff2024olmoeopenmixtureofexpertslanguage,
      title={OLMoE: Open Mixture-of-Experts Language Models},
      author={Niklas Muennighoff and Luca Soldaini and Dirk Groeneveld and Kyle Lo and Jacob Morrison and Sewon Min and Weijia Shi and Pete Walsh and Oyvind Tafjord and Nathan Lambert and Yuling Gu and Shane Arora and Akshita Bhagia and Dustin Schwenk and David Wadden and Alexander Wettig and Binyuan Hui and Tim Dettmers and Douwe Kiela and Ali Farhadi and Noah A. Smith and Pang Wei Koh and Amanpreet Singh and Hannaneh Hajishirzi},
      year={2024},
      eprint={2409.02060},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2409.02060},
}
```
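GGUF files are typically run with llama.cpp. A minimal usage sketch follows; the repo id and `.gguf` file name below are placeholders, not names confirmed by this card — substitute the actual values from this repository's file list.

```shell
# Download one quantized file from this repo
# (<this-repo-id> and <model>.gguf are placeholders; check the Files tab).
huggingface-cli download <this-repo-id> <model>.gguf --local-dir .

# Run a short completion with llama.cpp's CLI
# (-m: model path, -p: prompt, -n: max tokens to generate).
./llama-cli -m <model>.gguf -p "What is a mixture-of-experts model?" -n 128
```

Smaller quantizations (e.g. Q4 variants, if present in the file list) trade some quality for lower memory use.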