DataLearner AI

Baichuan 13B - Chat

Release date: 2023-07-08 (updated 2023-08-14)
Parameters
13.0B
Context length
4K
Chinese support
Supported
Reasoning ability
Not supported

Data sourced primarily from official releases (GitHub, Hugging Face, papers), then benchmark leaderboards, then third-party evaluators. Learn about our data methodology

Model basics

Reasoning traces
Not supported
Thinking modes
Not supported
Context length
4K tokens
Max output length
No data
Model type
Chat LLM
Release date
2023-07-08
Model file size
26.6GB
MoE architecture
No
Total params / Active params
13.0B / N/A
Knowledge cutoff
No data
Open source & experience

Code license
Apache 2.0
Weights license
Free commercial-use license
GitHub repo
https://github.com/baichuan-inc/Baichuan-13B
Hugging Face
https://huggingface.co/baichuan-inc/Baichuan-13B-Chat
Live demo
No live demo
Official resources

Paper
No paper available
DataLearnerAI blog
No blog post yet
API details

API speed
No data
No public API pricing yet.
Benchmark Results

No benchmark data to show.
Publisher

Baichuan Intelligence (百川智能)
View publisher details
Model Overview

Baichuan-13B-Chat is the aligned (chat) version in the Baichuan-13B series; the pre-trained model is available as Baichuan-13B-Base.

Baichuan-13B is an open-source, commercially usable large language model with 13 billion parameters, developed by Baichuan Intelligence as the successor to Baichuan-7B. At release it achieved the best results for its size on authoritative Chinese and English benchmarks. The release includes two versions: pre-trained (Baichuan-13B-Base) and aligned (Baichuan-13B-Chat). Baichuan-13B has the following characteristics:

  1. Larger size, more data: Baichuan-13B expands the parameter count to 13 billion over Baichuan-7B and was trained on 1.4 trillion tokens of high-quality corpus, 40% more than LLaMA-13B and the largest training corpus of any open-source 13B model at the time. It supports both Chinese and English, uses ALiBi positional encoding, and has a context window of 4096 tokens.
  2. Both pre-trained and aligned models released: the pre-trained model is a "base" suited to developers, while most users have a stronger need for an aligned model with conversational ability. This release therefore also open-sources the aligned model (Baichuan-13B-Chat), which offers strong dialogue capability out of the box and can be deployed with just a few lines of code.
  3. More efficient inference: to reach a broader set of users, int8 and int4 quantized versions are released as well. With almost no loss in quality relative to the unquantized model, they substantially lower the hardware requirements for deployment, allowing the model to run on consumer GPUs such as the Nvidia 3090.
  4. Open source and free for commercial use: Baichuan-13B is fully open for academic research; developers need only apply by email for the official commercial license to use it commercially free of charge.
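The overview notes that Baichuan-13B uses ALiBi positional encoding with a 4096-token context window. As a minimal illustrative sketch, following the original ALiBi recipe (Press et al.) rather than Baichuan's actual implementation, the per-head slopes and the linear attention bias can be computed like this:

```python
import math

def alibi_slopes(n_heads):
    """Per-head slopes from the ALiBi paper: a geometric sequence
    starting at 2**(-8/n) when n is a power of two."""
    def power_of_2_slopes(n):
        start = 2.0 ** (-(2.0 ** -(math.log2(n) - 3)))
        return [start ** (i + 1) for i in range(n)]
    if math.log2(n_heads).is_integer():
        return power_of_2_slopes(n_heads)
    # For non-power-of-two head counts (Baichuan-13B uses 40 heads),
    # interleave slopes taken from the next power of two.
    closest = 2 ** math.floor(math.log2(n_heads))
    return (power_of_2_slopes(closest)
            + power_of_2_slopes(2 * closest)[0::2][: n_heads - closest])

def alibi_bias(n_heads, seq_len):
    """Bias added to causal attention logits: each head penalizes
    distant keys linearly, -slope * (query_pos - key_pos)."""
    slopes = alibi_slopes(n_heads)
    return [[[-s * (i - j) if j <= i else 0.0
              for j in range(seq_len)]
             for i in range(seq_len)]
            for s in slopes]
```

Because the bias is a fixed linear penalty rather than a learned position embedding, ALiBi-based models need no embedding table for positions, which is one reason they can sometimes generalize modestly beyond the training context; Baichuan-13B's advertised window is 4096 tokens.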
