Ziya-LLaMA-13B-Pretrain-v1
Ziya-LLaMA-13B-Pretrain-v1 is a foundation model published by the IDEA Research Institute (IDEA研究院) on 2023-06-01. It has 13B parameters and a 4K-token context length, requires about 26 GB of storage, and is released under a license that allows commercial use.
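The ~26 GB storage figure is consistent with storing 13B parameters in half precision (2 bytes per parameter). A minimal sketch of that arithmetic, assuming fp16/bf16 weights and decimal gigabytes:

```python
def model_storage_gb(n_params: float, bytes_per_param: int) -> float:
    """Approximate checkpoint size in decimal gigabytes (1 GB = 1e9 bytes)."""
    return n_params * bytes_per_param / 1e9

# 13B parameters at 2 bytes each (fp16/bf16) -> roughly the listed ~26 GB.
fp16_gb = model_storage_gb(13e9, 2)
print(f"fp16: ~{fp16_gb:.0f} GB")  # ~26 GB
```

The same formula gives ~52 GB for fp32 weights and ~13 GB for 8-bit quantized weights, which is why quantized variants are often preferred for local inference.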
