Llama3.1-405B

Foundation model

Release date: 2024-07-23 · Updated: 2024-07-23 23:50:55
Parameters: 405B
Context length: 128K
Chinese support: Not supported
Reasoning ability: No data
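Since the 128K context limit governs how much text the model can ingest at once, a quick way to sanity-check prompt sizes is a character-based estimate. This is a heuristic sketch: the 4-characters-per-token ratio is an assumption for English text, not the model's real tokenizer, which gives exact counts.

```python
# Rough check of whether a prompt fits in Llama3.1-405B's 128K-token context.
# Assumption: English text averages about 4 characters per token.
def fits_in_context(text: str, context_tokens: int = 128_000,
                    chars_per_token: float = 4.0) -> bool:
    estimated_tokens = len(text) / chars_per_token
    return estimated_tokens <= context_tokens

print(fits_in_context("hello world " * 1_000))    # short prompt: fits
print(fits_in_context("hello world " * 100_000))  # ~300K estimated tokens: does not fit
```

For production use, replace the heuristic with a real token count from the model's tokenizer.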

Llama3.1-405B is a foundation model published by Facebook AI Research, released on 2024-07-23, with 405B parameters, a 128K-token context length, and roughly 800GB of weight storage, distributed under the LLAMA 3.1 COMMUNITY LICENSE AGREEMENT.

Data is sourced primarily from official releases (GitHub, Hugging Face, papers), then from benchmark leaderboards, then from third-party evaluators. Learn about our data methodology.

Model basics

Reasoning traces: Not supported
Thinking modes: Not supported
Context length: 128K tokens
Max output length: No data
Model type: Foundation model
Release date: 2024-07-23
Model file size: 800GB
MoE architecture: No
Total params / Active params: 405B / N/A
Knowledge cutoff: No data
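The listed file size can be cross-checked against the parameter count with back-of-envelope arithmetic. This sketch assumes the released weights are stored at 2 bytes per parameter (bf16), which is an assumption, not a statement from the model card:

```python
# Estimate weight storage from parameter count and bytes per parameter.
# Assumption: bf16 precision, i.e. 2 bytes per parameter.
def weight_storage_gb(n_params: float, bytes_per_param: int = 2) -> float:
    return n_params * bytes_per_param / 1e9

print(weight_storage_gb(405e9))  # 810.0 GB, consistent with the ~800GB listed
```

The small gap between 810GB and the listed ~800GB is within rounding of the parameter count and file-format overhead.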
Open source & experience

Code license: LLAMA 3.1 COMMUNITY LICENSE AGREEMENT
Weights license: LLAMA 3.1 COMMUNITY LICENSE AGREEMENT (free for commercial use)
GitHub repo: https://github.com/meta-llama/llama-models/tree/main/models/llama3_1
Hugging Face: https://huggingface.co/meta-llama/Meta-Llama-3.1-405B
Live demo: No live demo
Official resources

Paper: models/llama3_1/MODEL_CARD.md
DataLearnerAI blog
API details

API speed: No data
API pricing: No public API pricing yet
Benchmark Results

Llama3.1-405B's benchmark coverage here is led by MMLU Pro, where it scores 61.60 and ranks 103 of 124 models. This page also consolidates core specs, context limits, and API pricing so you can weigh benchmark results against deployment constraints.

General Knowledge (1 evaluation)

Benchmark / mode          Score   Rank / total
MMLU Pro (Standard Mode)  61.60   103 / 124
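To put the MMLU Pro rank in perspective, a rank of 103 out of 124 can be converted into the share of listed models it outperforms. This is a simple illustrative sketch (rank 1 is best), not part of DataLearner's methodology:

```python
# Convert a leaderboard rank into the percentage of models ranked below it.
# Convention: rank 1 is the best-scoring model.
def percent_outperformed(rank: int, total: int) -> float:
    return 100.0 * (total - rank) / total

print(round(percent_outperformed(103, 124), 1))  # 16.9
```

So on this benchmark the model outscores roughly 17% of the ranked models.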
Publisher

Facebook AI Research
View publisher details


Compare with other models

No curated comparisons for this model yet.

Want a custom combination? Open the compare tool