InternLM Chat 7B 8K
InternLM Chat 7B 8K is an AI model published by Shanghai AI Laboratory (上海人工智能实验室), released on 2023-06-03, with 7.0B parameters and an 8K-token context length. It requires about 14.5GB of storage and is available under a free commercial-use license.
Data is sourced primarily from official releases (GitHub, Hugging Face, papers), then benchmark leaderboards, then third-party evaluators.

No curated comparisons for this model yet.