DataLearnerAI
© 2026 DataLearner AI. DataLearner curates industry data and case studies so researchers, enterprises, and developers can rely on trustworthy intelligence.


Gemma 4 120B

Rumored · Foundation model

Release date: 2026-05-19
Parameters: 120B
Context length: 128K
Chinese support: Not supported
Reasoning ability: No data

Gemma 4 120B is a foundation model published by Google DeepMind, released on 2026-05-19, with 120B parameters and a 128K-token context length, under the Gemma Terms of Use license.
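To put the 120B parameter count in perspective, weight memory can be estimated as parameter count times bytes per parameter. This is a back-of-envelope sketch, not an official requirement: the precisions listed are illustrative assumptions, and real serving adds KV-cache and activation memory on top of the weights.

```python
# Back-of-envelope weights-only memory estimate for a 120B-parameter model.
# Byte sizes per precision are illustrative assumptions, not vendor figures.
PARAMS = 120e9

def weight_gib(params: float, bytes_per_param: float) -> float:
    """Weights-only footprint in GiB (1 GiB = 2**30 bytes)."""
    return params * bytes_per_param / 2**30

for name, nbytes in [("bf16", 2), ("fp8", 1), ("int4", 0.5)]:
    print(f"{name}: ~{weight_gib(PARAMS, nbytes):.0f} GiB")
```

By this estimate, bf16 weights alone land in the low hundreds of GiB, which is why large MoE models are typically served quantized or sharded across devices.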

Data is sourced primarily from official releases (GitHub, Hugging Face, papers), then from benchmark leaderboards, then from third-party evaluators. Learn about our data methodology.

Model basics

Reasoning traces: Not supported
Thinking modes: Not supported
Context length: 128K tokens
Max output length: 8192 tokens
Model type: Foundation model
Release date: 2026-05-19
Model file size: No data
MoE architecture: Yes
Total params / Active params: 120B / No data
Knowledge cutoff: No data
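The listed 128K-token context and 8192-token max output imply a practical prompt budget. The helper below is a sketch that assumes the output reservation counts against the context window; that is an assumption about this (rumored) model, not a documented guarantee.

```python
# Hypothetical prompt-budget helper for the listed limits.
# Assumption: reserved output tokens count against the context window.
CONTEXT_WINDOW = 128 * 1024  # 128K tokens
MAX_OUTPUT = 8192

def max_prompt_tokens(context: int = CONTEXT_WINDOW,
                      reserved_output: int = MAX_OUTPUT) -> int:
    """Tokens left for the prompt after reserving room for the reply."""
    return max(context - reserved_output, 0)

print(max_prompt_tokens())  # 122880
```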
Open source & experience

Code license: Gemma Terms of Use
Weights license: Gemma Terms of Use (free for commercial use)
GitHub repo: Link unavailable
Hugging Face: Link unavailable
Live demo: No live demo
Official resources

Paper: No paper available
DataLearnerAI blog: No blog post yet
API details

API speed: 3/5
API pricing: No public API pricing yet.
Benchmark Results

No benchmark data to show.

Compare with other models

No curated comparisons for this model yet.

Want a custom combination? Open the compare tool

Publisher

Google DeepMind
View publisher details