Ziya-LLaMA-13B-v1
Ziya-LLaMA-13B-v1 is an AI model published by IDEA研究院 (IDEA Research Institute), released on 2023-05-16, in the foundation-model (基础大模型) category, with 13.0B parameters and a 4K-token context length, requiring about 26 GB of storage, under an open-source, non-commercial (开源不可商用) license.
Data is sourced primarily from official releases (GitHub, Hugging Face, papers), then from benchmark leaderboards, then from third-party evaluators.
Ziya (姜子牙) general-purpose large model V1 is a 13-billion-parameter large-scale pre-trained model based on LLaMA, with capabilities including translation, programming, text classification, information extraction, summarization, copywriting, commonsense question answering, and mathematical computation. The model has completed a three-stage training process: large-scale pre-training, multi-task supervised fine-tuning, and learning from human feedback.
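As a sketch of how the model might be used for the question-answering and generation abilities described above, the following assumes Hugging Face `transformers` and a usable checkpoint at `IDEA-CCNL/Ziya-LLaMA-13B-v1` (the official release ships LLaMA delta weights that must first be merged; the path, the `<human>:`/`<bot>:` chat template, and the sampling settings here are illustrative assumptions, not the authoritative recipe):

```python
def build_prompt(query: str) -> str:
    # Assumed Ziya chat template: one human turn followed by an open bot turn
    # for the model to complete.
    return f"<human>:{query}\n<bot>:"


def generate_reply(query: str, model_path: str = "IDEA-CCNL/Ziya-LLaMA-13B-v1") -> str:
    # Imports are kept local so the prompt helper above stays usable
    # even without torch/transformers installed.
    import torch
    from transformers import AutoTokenizer, LlamaForCausalLM

    tokenizer = AutoTokenizer.from_pretrained(model_path)
    # fp16 weights for a 13B model need roughly 26 GB, matching the
    # storage figure listed above.
    model = LlamaForCausalLM.from_pretrained(
        model_path, torch_dtype=torch.float16, device_map="auto"
    )
    inputs = tokenizer(build_prompt(query), return_tensors="pt").to(model.device)
    output = model.generate(
        **inputs, max_new_tokens=256, do_sample=True, temperature=0.8
    )
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

A call such as `generate_reply("帮我写一份去西安的旅游计划")` would then return the prompt followed by the model's completion.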
