
MPT-7B-Base

Foundation model

MosaicML Pretrained Transformer - 7B Base

Release date: 2023-05-05 · Updated: 2023-06-23
Parameters: 6.7B
Context length: 2K
Chinese support: Not supported
Reasoning ability: Not supported

MosaicML Pretrained Transformer - 7B Base is an AI model published by MosaicML and released on 2023-05-05. It is a foundation model with 6.7B parameters and a 2K-token context length, requires about 13.3GB of storage, and is released under the Apache 2.0 license.
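
The storage figure can be sanity-checked from the parameter count alone: the weights are stored in 16-bit precision, so the checkpoint takes roughly two bytes per parameter. A quick sketch:

```python
# Back-of-the-envelope checkpoint size: parameters * bytes per parameter.
params = 6.7e9
bytes_per_param = 2  # fp16 / bf16 weights
print(f"{params * bytes_per_param / 1e9:.1f} GB")  # -> 13.4 GB, within rounding of the listed 13.3GB
```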

Data is sourced primarily from official releases (GitHub, Hugging Face, papers), then from benchmark leaderboards, then from third-party evaluators. Learn about our data methodology.

Model basics

Reasoning traces: Not supported
Thinking modes: Not supported
Context length: 2K tokens
Max output length: No data
Model type: Foundation model
Release date: 2023-05-05
Model file size: 13.3GB
MoE architecture: No
Total params / Active params: 6.7B / N/A
Knowledge cutoff: No data

Open source & experience

Code license: Apache 2.0
Weights license: Apache 2.0 (free for commercial use)
GitHub repo: https://github.com/mosaicml/llm-foundry
Hugging Face: https://huggingface.co/mosaicml/mpt-7b
Live demo: No live demo
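
A minimal sketch of loading the checkpoint from the Hugging Face repo above with the transformers library (assuming a reasonably recent transformers and PyTorch install). MPT ships its custom model class inside the repo, so loading requires trust_remote_code=True:

```python
# Minimal sketch: load MPT-7B-Base with Hugging Face transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

name = "mosaicml/mpt-7b"
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(
    name,
    torch_dtype=torch.bfloat16,  # 16-bit weights, in line with the ~13.3GB checkpoint
    trust_remote_code=True,      # MPT's model code lives in the repo itself
)

inputs = tok("MosaicML is", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=20)
print(tok.decode(out[0]))
```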

Official resources

Paper: Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs
DataLearnerAI blog: No blog post yet

API details

API speed: No data
No public API pricing yet.

Benchmark Results

No benchmark data to show.

Publisher

MosaicML

Model Overview

MPT-7B is a transformer-family large model from MosaicML, trained on 1 trillion tokens of text and code. It is fully open source, permits commercial use, and is roughly comparable in quality to LLaMA-7B.

MPT series overview: https://www.datalearner.com/ai-models/foundation-models/MPT


MPT-7B-Base is the base model of the series: a decoder-style transformer with 6.7 billion parameters, trained on the 1-trillion-token text and code dataset collected by MosaicML's data team. The base model uses FlashAttention for fast training and inference, and ALiBi for fine-tuning on and extrapolating to longer context lengths.
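
For intuition on why ALiBi allows extrapolation past the 2K training context, here is a minimal sketch (hypothetical helper names, not MosaicML's implementation): instead of positional embeddings, each attention head subtracts a fixed slope times the query-key distance from its pre-softmax scores.

```python
# Minimal ALiBi sketch (hypothetical helpers, not MosaicML's code).
import torch

def alibi_slopes(n_heads: int) -> torch.Tensor:
    # Geometric slope schedule from the ALiBi paper: 2^(-8h/n_heads),
    # exact when n_heads is a power of two.
    return torch.tensor([2.0 ** (-8.0 * (h + 1) / n_heads) for h in range(n_heads)])

def alibi_bias(n_heads: int, seq_len: int) -> torch.Tensor:
    pos = torch.arange(seq_len)
    dist = (pos[:, None] - pos[None, :]).clamp(min=0)  # causal query-key distance
    # Shape (n_heads, seq_len, seq_len); added to attention logits in every layer.
    return -alibi_slopes(n_heads)[:, None, None] * dist

# The bias depends only on relative distance, so the same formula applies at
# sequence lengths beyond the 2K tokens used in training.
print(alibi_bias(n_heads=4, seq_len=6).shape)  # torch.Size([4, 6, 6])
```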

Foundation model

MPT
