PaLM-Coder
Data sourced primarily from official releases (GitHub, Hugging Face, papers), then benchmark leaderboards, then third-party evaluators.
PaLM-Coder is Google's code-generation large language model, obtained by fine-tuning PaLM on code. It has 540 billion parameters.
