Llama3.1-405B
Llama3.1-405B is an AI model published by Facebook AI Research Lab, released on 2024-07-23, as a foundation model with 405B parameters and a 128K-token context length, requiring about 800GB of storage, under the LLAMA 3.1 COMMUNITY LICENSE AGREEMENT.
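The ~800GB storage figure follows directly from the parameter count times the bytes per parameter. A minimal sketch of that arithmetic, assuming bf16 (2 bytes per parameter) weights; actual checkpoint sizes vary with file format and metadata overhead:

```python
def weight_storage_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Rough storage estimate for dense LLM weights: params x bytes per param.

    Assumes bf16/fp16 (2 bytes per parameter) by default.
    """
    return num_params * bytes_per_param / 1e9

# 405B parameters at 2 bytes each -> roughly 810 GB,
# consistent with the ~800GB figure quoted above.
print(round(weight_storage_gb(405e9)))
```

An fp32 copy of the same weights would double this to roughly 1.6TB, which is why half-precision checkpoints are the common distribution format for models at this scale.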
Data is sourced primarily from official releases (GitHub, Hugging Face, papers), then from benchmark leaderboards, and finally from third-party evaluators.
Llama3.1-405B's strongest benchmark placement is currently on MMLU Pro, ranking 103 of 124 with a score of 61.60. This page also consolidates core specs, context limits, and API pricing so you can evaluate the model on benchmark results and deployment constraints together.
