XVERSE-MoE-A4.2B
XVERSE-MoE-A4.2B is a foundation model published by 元象XVERSE (XVERSE), released on 2024-04-01, with 25.8B total parameters (4.2B activated per token), a 4K-token context length, and roughly 51.5GB of storage required for the weights, under the Apache 2.0 license.
Data is sourced primarily from official releases (GitHub, Hugging Face, papers), then from benchmark leaderboards and third-party evaluators.
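For readers who want to try the model, a minimal loading sketch follows. It assumes the checkpoint is hosted on Hugging Face under the repo id `xverse/XVERSE-MoE-A4.2B` (consistent with other XVERSE releases) and uses the standard `transformers` auto classes; downloading the full ~51.5GB of weights is required before generation will run.

```python
# Minimal sketch for loading XVERSE-MoE-A4.2B with Hugging Face transformers.
# Assumption: the repo id below matches the official Hugging Face release.
MODEL_ID = "xverse/XVERSE-MoE-A4.2B"

def load_model(model_id: str = MODEL_ID):
    """Load tokenizer and model; downloads ~51.5GB of weights on first use."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",      # keep the checkpoint's native precision
        trust_remote_code=True,  # MoE architecture ships custom model code
    )
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load_model()
    inputs = tokenizer("北京的景点：", return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

`trust_remote_code=True` is needed because MoE model classes are often not yet in the upstream `transformers` package and are loaded from the repo itself.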
