ChatGLM2-6B-32K
ChatGLM2-6B-32K is an AI language model published by 智谱AI (Zhipu AI) and released on 2023-07-31. It has 6.2B parameters and a 32K-token context length, requires about 11.8GB of storage, and is distributed under the ChatGLM2-6B Model License.
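The ~11.8GB storage figure is roughly consistent with storing the 6.2B weights in half precision (2 bytes per parameter), plus some checkpoint overhead. A minimal sketch of that back-of-the-envelope check, assuming an fp16 checkpoint:

```python
# Rough storage estimate for an fp16 checkpoint of a 6.2B-parameter model.
# Assumptions: 2 bytes per parameter (fp16), no quantization; the real
# checkpoint also carries tokenizer files and metadata, hence ~11.8GB on disk.
params = 6.2e9
bytes_per_param = 2  # fp16
size_gib = params * bytes_per_param / 2**30

print(f"~{size_gib:.1f} GiB of raw weights")  # just under the stated 11.8GB
```

The small gap between the raw-weight estimate and the published size is expected: saved checkpoints include optimizer-free metadata, tokenizer assets, and serialization overhead.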
Data is sourced primarily from official releases (GitHub, Hugging Face, papers), then benchmark leaderboards, then third-party evaluators.
