RecurrentGemma-2B
RecurrentGemma-2B is a foundation model published by Google Research, released on 2024-04-09, with 2.7B parameters and an 8K-token context length, requiring about 5.37GB of storage, under the Gemma Terms of Use license.
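The listed storage figure is consistent with half-precision weights. A minimal back-of-envelope sketch, assuming roughly 2.68B parameters (a hypothetical rounding of the 2.7B count) stored as bfloat16 at 2 bytes each:

```python
# Back-of-envelope check of the ~5.37GB storage figure, assuming
# ~2.68B parameters (assumption) stored as bfloat16 (2 bytes each).
PARAMS = 2.68e9          # approximate parameter count (assumed)
BYTES_PER_PARAM = 2      # bfloat16 / float16

storage_gb = PARAMS * BYTES_PER_PARAM / 1e9  # decimal gigabytes
print(f"Estimated checkpoint size: {storage_gb:.2f} GB")
```

This yields about 5.36 GB, matching the listed footprint to within rounding; optimizer states or float32 checkpoints would be two to four times larger.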
Data is sourced primarily from official releases (GitHub, Hugging Face, papers), then from benchmark leaderboards, then from third-party evaluators.
