InternLM2-Base-20B
InternLM2-Base-20B is an AI model published by Shanghai AI Laboratory (上海人工智能实验室), released on 2024-01-17 as a foundation large language model, with 20.0B parameters and a 200K-token context length, requiring about 40GB of storage, under a free commercial-use license.
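The listed ~40GB footprint is consistent with a simple back-of-envelope estimate (a sketch assuming the weights are stored in fp16/bf16 at 2 bytes per parameter; this is an assumption, not an official figure):

```python
# Rough storage estimate for a 20B-parameter model.
# Assumption: weights in fp16/bf16, i.e. 2 bytes per parameter.
params = 20_000_000_000
bytes_per_param = 2
storage_gb = params * bytes_per_param / 1e9
print(f"{storage_gb:.0f} GB")  # prints "40 GB"
```

This ignores tokenizer files and metadata, so the real on-disk size is slightly larger.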
Data sourced primarily from official releases (GitHub, Hugging Face, papers), then benchmark leaderboards, then third-party evaluators.
InternLM2 is the latest open-source large language model from Shanghai AI Laboratory and the second generation of the InternLM series. Compared with the first generation, its context length extends up to 200K tokens and its overall performance improves markedly.
InternLM2-Base-20B is the series' 20-billion-parameter base model; it performs roughly on par with Yi-34B and scores close to GPT-3.5 on the MMLU benchmark.
| Benchmark | InternLM2-7B | InternLM2-Chat-7B | InternLM2-20B | InternLM2-Chat-20B | ChatGPT | GPT-4 |
|---|---|---|---|---|---|---|
| MMLU | 65.8 | 63.7 | 67.7 | 66.5 | 69.1 | 83.0 |
| AGIEval | 49.9 | 47.2 | 53.0 | 50.3 | 39.9 | 55.1 |
| BBH | 65.0 | 61.2 | 72.1 | 68.3 | 70.1 | 86.7 |
| GSM8K | 70.8 | 70.7 | 76.1 | 79.6 | 78.2 | 91.4 |
| MATH | 20.2 | 23.0 | 25.5 | 31.9 | 28.0 | 45.8 |
| HumanEval | 43.3 | 59.8 | 48.8 | 67.1 | 73.2 | 74.4 |
| MBPP (Sanitized) | 51.8 | 51.4 | 63.0 | 65.8 | 78.9 | 79.0 |
