GLM4-MoE-100B-A10B
GLM4-MoE-100B-A10B is an AI model published by 智谱AI, released on 2025-07-14, with 100B total parameters (10B active per token, as the name indicates), a 128K-token context length, and roughly 200GB of required storage, under the Apache 2.0 license.
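As a quick sanity check, the listed ~200GB storage figure is consistent with 100B parameters stored at 16-bit precision (2 bytes per parameter). A minimal sketch of that arithmetic, assuming 16-bit weights:

```python
# Rough on-disk size estimate for a dense checkpoint.
# Assumption: weights stored in 16-bit precision (2 bytes/parameter);
# actual size varies with format, sharding, and metadata overhead.
def estimated_storage_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Return approximate checkpoint size in gigabytes (1 GB = 1e9 bytes)."""
    return num_params * bytes_per_param / 1e9

size_gb = estimated_storage_gb(100e9)
print(f"~{size_gb:.0f} GB")  # ~200 GB, matching the listing
```

The same formula halves to ~100GB for an 8-bit quantized copy and doubles to ~400GB for full 32-bit weights.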
Data is sourced primarily from official releases (GitHub, Hugging Face, papers), then from benchmark leaderboards, then from third-party evaluators.
