🟢 : Pretrained models: new base models that have been pretrained on a given corpus.
🔶 : Domain-specific fine-tuned models: pretrained models further fine-tuned on domain-specific datasets for better performance.
💬 : Chat models: chat-style fine-tuned models, trained on task-instruction datasets via IFT (instruction fine-tuning), RLHF (reinforcement learning from human feedback), or DPO (direct preference optimization, which modifies the training loss using preference pairs and a reference policy).
🤝 : Base merges and MoErges: models built by merging or MoErging several models, with no additional fine-tuning. If you come across a model with no icon, feel free to open an issue so its information can be filled in.
❓ : Unknown.
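As a sketch of what the DPO note above refers to: DPO replaces RLHF's separate reward model and RL stage with a single classification-style loss over preference pairs. The standard form (symbols are the usual ones: policy \(\pi_\theta\), reference model \(\pi_{\text{ref}}\), chosen/rejected responses \(y_w, y_l\), temperature \(\beta\)):

```latex
\mathcal{L}_{\text{DPO}}(\pi_\theta;\pi_{\text{ref}}) =
-\,\mathbb{E}_{(x,\,y_w,\,y_l)\sim\mathcal{D}}\!\left[
\log \sigma\!\left(
\beta \log \frac{\pi_\theta(y_w\mid x)}{\pi_{\text{ref}}(y_w\mid x)}
-\beta \log \frac{\pi_\theta(y_l\mid x)}{\pi_{\text{ref}}(y_l\mid x)}
\right)\right]
```

Minimizing this loss nudges the policy to assign relatively higher likelihood to the preferred response than the reference model does, without an explicit reward model.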
| Model Name | Type | Params (×100M) | Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K | Architecture |
|---|---|---|---|---|---|---|---|---|---|---|
| Platypus2-70B-instruct 📑 | 🔶 | 689.8 | 69.3 | 71.84 | 87.94 | 70.48 | 62.26 | 82.72 | 40.56 | LlamaForCausalLM |
| Yi-34B-200K-rawrr1-LORA-DPO-experimental-r3 📑 | 🔶 | 340 | 69.29 | 64.85 | 84.77 | 76.0 | 45.35 | 83.11 | 61.64 | ? |
| BeagleLake-7B-Toxic 📑 | 🔶 | 70 | 69.24 | 65.19 | 83.83 | 62.82 | 57.67 | 82.32 | 63.61 | MistralForCausalLM |
| LaseredHermes-7B-v1 📑 | 🔶 | 72.4 | 69.2 | 66.98 | 85.22 | 63.6 | 59.01 | 78.3 | 62.09 | MistralForCausalLM |
| Pallas-0.5-LASER-0.3 📑 | 🔶 | 343.9 | 69.17 | 64.76 | 83.17 | 74.66 | 55.43 | 80.9 | 56.1 | LlamaForCausalLM |
| Yi-34b-200K-rawrr-v2-run-0902-LoRA 📑 | 🔶 | 340 | 69.15 | 64.68 | 84.5 | 75.76 | 46.66 | 81.14 | 62.17 | ? |
| airoboros-l2-70b-2.2.1 📑 | 💬 | 700 | 69.13 | 69.71 | 87.95 | 69.79 | 59.49 | 82.95 | 44.88 | LlamaForCausalLM |
| LaseredHermes-7B-v1 📑 | 🔶 | 72.4 | 69.09 | 66.89 | 85.21 | 63.58 | 59.09 | 78.45 | 61.33 | MistralForCausalLM |
| openchat-nectar-0.14 📑 | 🔶 | 72.4 | 69.09 | 65.61 | 83.02 | 64.58 | 50.09 | 82.0 | 69.22 | MistralForCausalLM |
| Mixtral-8x7B-v0.1-top3 📑 | 🤝 | 467 | 69.09 | 67.41 | 86.63 | 71.98 | 48.58 | 82.4 | 57.54 | MixtralForCausalLM |
| Optimus-7B 📑 | 🔶 | 72.4 | 69.09 | 65.44 | 85.41 | 63.61 | 55.79 | 78.77 | 65.5 | MistralForCausalLM |
| loyal-piano-m7-cdpo 📑 | 💬 | 72.4 | 69.08 | 67.15 | 85.39 | 64.52 | 61.53 | 79.4 | 56.48 | MistralForCausalLM |
| Mistral-CatMacaroni-slerp-gradient 📑 | 🔶 | 72.4 | 69.08 | 65.53 | 85.66 | 61.53 | 64.1 | 80.03 | 57.62 | Unknown |
| Neural-una-cybertron-7b 📑 | 🔶 | 72.4 | 69.05 | 69.03 | 84.51 | 62.79 | 64.99 | 80.66 | 52.31 | Unknown |
| orca_mini_v3_70b 📑 | 🔶 | 687.2 | 69.02 | 71.25 | 87.85 | 70.18 | 61.27 | 82.72 | 40.86 | Unknown |
| loyal-piano-m7-cdpo 📑 | 🔶 | 72.4 | 69.0 | 67.06 | 85.42 | 64.54 | 61.54 | 79.08 | 56.33 | MistralForCausalLM |
| servile-harpsichord-cdpo 📑 | 💬 | 72.4 | 68.98 | 67.32 | 85.18 | 64.54 | 60.61 | 79.16 | 57.09 | MistralForCausalLM |
| LeoScorpius-GreenNode-Platypus-7B-v1 📑 | 🔶 | 70 | 68.96 | 66.04 | 86.53 | 62.06 | 52.78 | 82.16 | 64.22 | MistralForCausalLM |
| LHK_44 📑 | 🔶 | 107.3 | 68.95 | 66.55 | 84.86 | 65.37 | 59.58 | 80.9 | 56.41 | LlamaForCausalLM |
| MegaDolphin-120b 📑 | 🔶 | 1203.2 | 68.91 | 69.03 | 87.8 | 69.26 | 59.28 | 81.85 | 46.25 | LlamaForCausalLM |
| openchat-3.5-1210 📑 | 🔶 | 72.4 | 68.89 | 64.93 | 84.92 | 64.62 | 52.15 | 80.74 | 65.96 | MistralForCausalLM |
| Mixtral-8x7B-peft-v0.1 📑 | 💬 | 70 | 68.87 | 67.24 | 86.03 | 68.59 | 59.54 | 80.43 | 51.4 | Unknown |
| FT 📑 | 🔶 | 343.9 | 68.85 | 63.05 | 82.78 | 69.69 | 59.88 | 79.64 | 58.07 | Unknown |
| kellemar-DPO-7B-d 📑 | 🔶 | 72.4 | 68.84 | 66.89 | 85.16 | 62.77 | 56.88 | 79.32 | 62.02 | MistralForCausalLM |
| mixtral_7bx4_moe 📑 | 🔶 | 241.5 | 68.83 | 65.27 | 85.28 | 62.84 | 59.85 | 77.66 | 62.09 | Unknown |
| Marcoroni-70B-v1 📑 | 🔶 | 687.2 | 68.83 | 73.55 | 87.62 | 70.67 | 64.41 | 83.43 | 33.28 | Unknown |
| FT 📑 | 🔶 | 343.9 | 68.81 | 63.14 | 82.78 | 69.5 | 59.8 | 79.4 | 58.23 | Unknown |
| Crunchy-onion 📑 | 🔶 | 467 | 68.75 | 67.15 | 86.19 | 70.02 | 63.88 | 73.24 | 52.01 | MixtralForCausalLM |
| LHK 📑 | 💬 | 107.3 | 68.74 | 66.38 | 84.49 | 65.13 | 59.12 | 80.98 | 56.33 | LlamaForCausalLM |
| A11P 📑 | 🔶 | 0 | 68.73 | 62.54 | 82.53 | 70.56 | 56.44 | 79.87 | 60.42 | Unknown |
| pic_7B_mistral_Full_v0.2 📑 | 🔶 | 70 | 68.72 | 65.36 | 84.03 | 64.51 | 59.2 | 79.48 | 59.74 | MistralForCausalLM |
| SOLAR-10.7B-dpo-instruct-tuned-v0.1 📑 | 💬 | 107.3 | 68.68 | 65.19 | 86.09 | 66.25 | 51.81 | 83.98 | 58.76 | LlamaForCausalLM |
| Yi-34B-AEZAKMI-v1 📑 | 💬 | 343.9 | 68.67 | 64.33 | 84.31 | 73.91 | 55.73 | 80.82 | 52.92 | LlamaForCausalLM |
| loyal-piano-m7 📑 | 💬 | 72.4 | 68.67 | 66.72 | 85.03 | 64.43 | 60.03 | 79.08 | 56.71 | MistralForCausalLM |
| A12P 📑 | 🔶 | 0 | 68.64 | 64.42 | 82.32 | 69.97 | 62.22 | 79.64 | 53.3 | Unknown |
| agiin-13.6B-v0.0 📑 | 🔶 | 137.8 | 68.63 | 69.45 | 86.59 | 61.94 | 67.4 | 78.69 | 47.69 | LlamaForCausalLM |
| spicyboros-70b-2.2 📑 | 🔶 | 700 | 68.62 | 70.73 | 87.58 | 70.32 | 58.31 | 83.82 | 40.94 | LlamaForCausalLM |
| Maya_Hermes-2.5-Mistral-7B 📑 | 🔶 | 72.4 | 68.6 | 66.3 | 85.07 | 63.23 | 55.89 | 78.85 | 62.24 | MistralForCausalLM |
| MixtralRPChat-ZLoss 📑 | 💬 | 467 | 68.59 | 68.6 | 86.1 | 70.44 | 53.85 | 82.0 | 50.57 | MixtralForCausalLM |
| model_007 📑 | 🔶 | 687.2 | 68.56 | 71.08 | 87.65 | 69.04 | 63.12 | 83.35 | 37.15 | Unknown |
| SpellBlade 📑 | 🔶 | 689.8 | 68.54 | 69.28 | 87.31 | 70.5 | 47.1 | 83.19 | 53.83 | LlamaForCausalLM |
| model_009 📑 | 🔶 | 687.2 | 68.53 | 71.59 | 87.7 | 69.43 | 60.72 | 82.32 | 39.42 | Unknown |
| stealth-v1.3 📑 | 🔶 | 72.4 | 68.53 | 65.19 | 84.44 | 62.7 | 59.12 | 78.61 | 61.11 | MistralForCausalLM |
| Chupacabra-7B-v2.04 📑 | 🔶 | 72.4 | 68.52 | 66.3 | 85.7 | 60.94 | 67.76 | 78.93 | 51.48 | MistralForCausalLM |
| Mixtral-8x7B-v0.1 ✅ 📑 | 🔶 | 467 | 68.47 | 66.38 | 86.46 | 71.88 | 46.81 | 81.69 | 57.62 | MixtralForCausalLM |
| model_101 📑 | 🔶 | 687.2 | 68.46 | 68.69 | 86.42 | 69.92 | 58.85 | 82.08 | 44.81 | Unknown |
| ds_diasum_md_mixtral 📑 | 💬 | 0 | 68.42 | 66.3 | 85.45 | 69.51 | 55.72 | 80.35 | 53.22 | Unknown |
| Mixtral-8x7B-v0.1 ✅ 📑 | 🟢 | 467 | 68.42 | 66.04 | 86.49 | 71.82 | 46.78 | 81.93 | 57.47 | MixtralForCausalLM |
| NeuralHermes-2.5-Mistral-7B-distilabel 📑 | 🔶 | 72.4 | 68.4 | 65.78 | 84.97 | 63.63 | 55.86 | 78.69 | 61.49 | MistralForCausalLM |
| agiin-13.6B-v0.1 📑 | 💬 | 137.8 | 68.4 | 69.45 | 86.64 | 61.15 | 67.97 | 78.69 | 46.47 | MistralForCausalLM |
| xDAN-L1-Chat-RL-v1 📑 | 🔶 | 72.4 | 68.38 | 66.3 | 85.81 | 63.21 | 56.7 | 78.85 | 59.44 | MistralForCausalLM |
| PlatYi-34B-Llama 📑 | 💬 | 343.9 | 68.37 | 67.83 | 85.35 | 78.26 | 53.46 | 82.87 | 42.46 | Unknown |
| kellemar-DPO-7B-v1.01 📑 | 🔶 | 72.4 | 68.32 | 65.78 | 85.04 | 63.24 | 55.54 | 78.69 | 61.64 | MistralForCausalLM |
| PlatYi-34B-Llama-Q-FastChat 📑 | 💬 | 343.9 | 68.31 | 66.13 | 85.25 | 78.37 | 53.62 | 82.16 | 44.35 | Unknown |
| neural-chat-7b-v3-2 📑 | 🔶 | 70 | 68.29 | 67.49 | 83.92 | 63.55 | 59.68 | 79.95 | 55.12 | MistralForCausalLM |
| Pallas-0.5-LASER-0.4 📑 | 🔶 | 343.9 | 68.28 | 63.31 | 82.74 | 74.32 | 55.25 | 80.58 | 53.45 | LlamaForCausalLM |
| WordWoven-13B 📑 | 🤝 | 128.8 | 68.25 | 66.13 | 85.81 | 64.06 | 54.45 | 78.93 | 60.12 | MixtralForCausalLM |
| llama2_70b_mmlu 📑 | 🔶 | 689.8 | 68.24 | 65.61 | 87.37 | 71.89 | 49.15 | 82.4 | 52.99 | LlamaForCausalLM |
| NeuralHermes-2.5-Mistral-7B 📑 | 🔶 | 72.4 | 68.22 | 66.55 | 84.9 | 63.32 | 54.93 | 78.3 | 61.33 | MistralForCausalLM |
| OrionStar-Yi-34B-Chat-Llama 📑 | 🔶 | 343.9 | 68.17 | 64.93 | 84.34 | 73.67 | 53.35 | 78.85 | 53.9 | LlamaForCausalLM |
| Sensualize-Solar-10.7B 📑 | 🔶 | 107.3 | 68.17 | 65.02 | 84.55 | 65.27 | 53.63 | 83.98 | 56.56 | LlamaForCausalLM |
| blossom-v3_1-yi-34b 📑 | 🔶 | 340 | 68.16 | 65.36 | 84.24 | 74.37 | 56.06 | 82.08 | 46.85 | LlamaForCausalLM |
| AZG 📑 | 🔶 | 0 | 68.16 | 62.88 | 82.02 | 70.29 | 53.84 | 79.95 | 59.97 | Unknown |
| CapybaraHermes-2.5-Mistral-7B 📑 | 💬 | 72.4 | 68.14 | 65.78 | 85.45 | 63.13 | 56.91 | 78.3 | 59.29 | MistralForCausalLM |
| agiin-11.1B-v0.0 📑 | 🔶 | 111.7 | 68.1 | 67.32 | 86.35 | 64.99 | 67.67 | 78.85 | 43.44 | LlamaForCausalLM |
| PlatYi-34B-LoRA 📑 | 💬 | 343.9 | 68.1 | 67.15 | 85.37 | 78.46 | 53.32 | 83.66 | 40.64 | LlamaForCausalLM |
| Merged-DPO-7B 📑 | 💬 | 70 | 68.06 | 68.94 | 87.75 | 55.35 | 72.76 | 78.37 | 45.19 | Unknown |
| lil-c3po 📑 | 💬 | 72.4 | 68.03 | 65.02 | 84.45 | 62.36 | 68.73 | 79.16 | 48.45 | Unknown |
| bagel-dpo-7b-v0.1 📑 | 🔶 | 72.4 | 67.95 | 66.72 | 84.16 | 64.24 | 64.05 | 80.9 | 47.61 | MistralForCausalLM |
| Pallas-0.5-LASER-exp2-0.1 📑 | 🔶 | 343.9 | 67.92 | 62.97 | 82.11 | 74.66 | 55.24 | 79.79 | 52.77 | LlamaForCausalLM |
| ThetaWave-7B-sft 📑 | 🔶 | 72.4 | 67.92 | 63.14 | 84.42 | 63.78 | 59.74 | 79.64 | 56.79 | MistralForCausalLM |
| PlatYi-34B-Llama-Q-v2 📑 | 💬 | 343.9 | 67.88 | 61.09 | 85.09 | 76.59 | 52.65 | 82.79 | 49.05 | LlamaForCausalLM |
| Einstein-openchat-7B 📑 | 🔶 | 72.4 | 67.87 | 65.1 | 83.57 | 64.01 | 54.51 | 79.16 | 60.88 | MistralForCausalLM |
| OpenAGI-7B-v0.1 📑 | 💬 | 72.4 | 67.87 | 68.26 | 85.06 | 61.6 | 59.4 | 79.79 | 53.07 | MistralForCausalLM |
| PlatYi-34B-200k-Q-FastChat 📑 | 💬 | 340 | 67.85 | 64.93 | 84.46 | 77.13 | 48.38 | 80.74 | 51.48 | LlamaForCausalLM |
| falcon-180B 📑 | 🟢 | 1795.2 | 67.85 | 69.45 | 88.86 | 70.5 | 45.47 | 86.9 | 45.94 | FalconForCausalLM |
| OpenHermes-2.5-neural-chat-7b-v3-1-7B 📑 | 🔶 | 72.4 | 67.84 | 66.55 | 84.47 | 63.34 | 61.22 | 78.37 | 53.07 | MistralForCausalLM |
| Mixtral-Orca-v0.1 📑 | 💬 | 467 | 67.82 | 69.71 | 88.88 | 66.06 | 63.85 | 81.14 | 37.3 | MixtralForCausalLM |
| SauerkrautLM-Mixtral-8x7B 📑 | 🔶 | 467 | 67.8 | 68.86 | 86.01 | 66.69 | 57.2 | 80.51 | 47.54 | MixtralForCausalLM |
| stealth-rag-v1.1 📑 | 🔶 | 72.4 | 67.79 | 62.12 | 83.83 | 64.06 | 49.64 | 79.32 | 67.78 | MistralForCausalLM |
| Xwin-Math-70B-V1.0 📑 | 🔶 | 700 | 67.78 | 64.51 | 84.88 | 66.2 | 51.58 | 81.53 | 58.0 | LlamaForCausalLM |
| DistilHermes-2.5-Mistral-7B 📑 | 🔶 | 72.4 | 67.76 | 65.87 | 84.78 | 63.65 | 54.24 | 78.22 | 59.82 | MistralForCausalLM |
| Chupacabra-7B 📑 | 🔶 | 72.4 | 67.76 | 66.81 | 83.52 | 62.68 | 52.31 | 79.08 | 62.17 | MistralForCausalLM |
| Bumblebee-7B 📑 | 🔶 | 72.4 | 67.73 | 63.4 | 84.16 | 64.0 | 50.96 | 78.22 | 65.66 | MistralForCausalLM |
| Voldemort-10B-DPO 📑 | 🔶 | 107.3 | 67.69 | 65.7 | 84.79 | 62.82 | 61.33 | 77.27 | 54.21 | MistralForCausalLM |
| Voldemort-10B-DPO 📑 | 🔶 | 107.3 | 67.68 | 66.04 | 84.84 | 62.88 | 61.44 | 77.03 | 53.83 | MistralForCausalLM |
| DPOpenHermes-7B 📑 | 🔶 | 72.4 | 67.63 | 65.96 | 85.9 | 63.98 | 56.92 | 78.22 | 54.81 | MistralForCausalLM |
| CCK-v2.0-DPO 📑 | 🔶 | 108.6 | 67.62 | 65.87 | 86.81 | 62.1 | 69.33 | 82.16 | 39.42 | LlamaForCausalLM |
| ORCA_LLaMA_70B_QLoRA 📑 | 🔶 | 700 | 67.6 | 72.27 | 87.74 | 70.23 | 63.37 | 83.66 | 28.35 | LlamaForCausalLM |
| DPOpenHermes-7B 📑 | 💬 | 72.4 | 67.58 | 65.7 | 85.96 | 63.89 | 56.95 | 78.61 | 54.36 | MistralForCausalLM |
| SeaLLM-7B-v2 📑 | 💬 | 70 | 67.57 | 62.03 | 82.32 | 61.89 | 51.11 | 79.08 | 68.99 | MistralForCausalLM |
| MoMo-70B-LoRA-V1.1 📑 | 💬 | 700 | 67.53 | 66.64 | 87.16 | 66.76 | 54.98 | 83.35 | 46.32 | Unknown |
| BigWeave-v6-90b 📑 | 🤝 | 878 | 67.47 | 65.36 | 87.21 | 68.04 | 57.96 | 81.69 | 44.58 | LlamaForCausalLM |
| FashionGPT-70B-V1 📑 | 🔶 | 700 | 67.47 | 71.08 | 87.32 | 70.7 | 63.92 | 83.66 | 28.13 | LlamaForCausalLM |
| juanako-7b-UNA 📑 | 🔶 | 72.4 | 67.46 | 68.17 | 85.34 | 62.47 | 65.13 | 78.85 | 44.81 | MistralForCausalLM |
| UNA-dolphin-2.6-mistral-7b-dpo-laser 📑 | 🔶 | 72.4 | 67.43 | 67.15 | 86.31 | 63.36 | 64.15 | 79.24 | 44.35 | MistralForCausalLM |
| Samantha-1.1-70b 📑 | 💬 | 687.2 | 67.43 | 68.77 | 87.46 | 68.6 | 64.85 | 83.27 | 31.61 | Unknown |
| Moe-2x7b-QA-Code 📑 | 🔶 | 128.8 | 67.42 | 65.19 | 85.36 | 61.71 | 65.23 | 77.35 | 49.66 | MixtralForCausalLM |
| CodeNinja-1.0-OpenChat-7B 📑 | 🔶 | 72.4 | 67.4 | 63.48 | 83.65 | 63.77 | 47.16 | 79.79 | 66.57 | MistralForCausalLM |
| test_42_70b 📑 | 🔶 | 687.2 | 67.38 | 68.26 | 87.65 | 70.0 | 48.76 | 83.66 | 45.94 | Unknown |
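Two conventions in the table are easy to misread, so here is a small sanity check: the Average column appears to be the plain arithmetic mean of the six benchmark scores, and the parameter column is in units of 100 million (the original Chinese unit 亿), so 689.8 corresponds to roughly 68.98 billion parameters. The sketch below checks both against the Platypus2-70B-instruct row; the dictionary keys are just labels for this example.

```python
# Sanity-check the leaderboard columns using the Platypus2-70B-instruct row.
scores = {
    "ARC": 71.84, "HellaSwag": 87.94, "MMLU": 70.48,
    "TruthfulQA": 62.26, "Winogrande": 82.72, "GSM8K": 40.56,
}

# The Average column matches the mean of the six benchmark scores.
average = round(sum(scores.values()) / len(scores), 2)  # 69.3, as listed

# The parameter column is in units of 100 million (亿):
# 689.8 x 100M = ~68.98 billion parameters, i.e. a "70B" model.
params_100m = 689.8
params_billion = params_100m / 10

print(average, params_billion)
```

The same check holds for the other rows, up to rounding to two decimal places.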