MPT-7B-8k-Chat
MPT-7B-8k-Chat is a foundation model published by MosaicML on 2023-07-18, with 7B parameters and an 8K-token context length, requiring about 13.3GB of storage, released under the CC BY-NC-SA 4.0 license.
Data is sourced primarily from official releases (GitHub, Hugging Face, papers), then from benchmark leaderboards, and finally from third-party evaluators.
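The checkpoint is distributed through Hugging Face, so a short loading sketch may be useful alongside the specs above. The snippet below is a minimal example, assuming the repo id mosaicml/mpt-7b-8k-chat, the transformers library with accelerate installed, and a bf16-capable GPU; the dtype, device placement, and the ChatML-style prompt template are assumptions, not details stated on this page.

```python
# Minimal loading sketch (assumed repo id: mosaicml/mpt-7b-8k-chat).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mosaicml/mpt-7b-8k-chat"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # roughly matches the ~13.3GB 16-bit footprint
    trust_remote_code=True,       # MPT ships a custom model class in the repo
    device_map="auto",
)

# ChatML-style prompt; the exact template is an assumption for illustration.
prompt = (
    "<|im_start|>user\nWhat does an 8K-token context window allow?<|im_end|>\n"
    "<|im_start|>assistant\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```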

No curated comparisons for this model yet.