MPT-7B-8K
Data sourced primarily from official releases (GitHub, Hugging Face, papers), then benchmark leaderboards, then third-party evaluators.
MPT's latest model, supporting an 8K context length.