Vicuna 7B 16K

Release date: 2023-08-03 · Updated: 2023-08-03 13:51:09
Parameters
7.0B
Context length
16K
Chinese support
Not supported
Reasoning ability
Not supported

Data sourced primarily from official releases (GitHub, Hugging Face, papers), then benchmark leaderboards, then third-party evaluators. Learn about our data methodology

Model basics

Reasoning traces
Not supported
Thinking modes
Not supported
Context length
16K tokens
Max output length
No data
Model type
Chat LLM
Release date
2023-08-03
Model file size
13.3GB
MoE architecture
No
Total params / Active params
7.0B / N/A
Knowledge cutoff
No data
Open source & experience

Code license
Apache 2.0
Weights license
Llama 2 Community License Agreement (free for commercial use)
GitHub repo
https://github.com/lm-sys/FastChat
Hugging Face
https://huggingface.co/lmsys/vicuna-13b-v1.5-16k
Live demo
No live demo
Official resources

Paper
Vicuna 7B 16K
DataLearnerAI blog
No blog post yet
API details

API speed
No data
No public API pricing yet.
Benchmark Results

No benchmark data to show.
Publisher

LM-SYS
View publisher details
Model Overview

The latest 16K-context version of Vicuna, open-sourced by LM-SYS and fine-tuned from LLaMA2. Officially described as Vicuna v1.5, it supports up to 16K tokens of context input and has 7 billion parameters.
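As a chat model served through FastChat, Vicuna expects its inputs in a specific conversation format. The sketch below builds a prompt in the style of FastChat's Vicuna v1.5 template; the exact system-prompt wording and separators are assumptions based on FastChat's defaults, so verify against the repository before relying on them.

```python
# Minimal sketch of a Vicuna-v1.5-style conversation prompt.
# ASSUMPTION: system prompt and "USER:"/"ASSISTANT:" separators follow
# FastChat's default Vicuna template; check lm-sys/FastChat to confirm.
SYSTEM = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the user's questions."
)

def build_prompt(turns):
    """turns: list of (user_message, assistant_reply_or_None) pairs.
    A trailing None reply leaves the prompt open for the model to complete."""
    parts = [SYSTEM]
    for user, assistant in turns:
        parts.append(f"USER: {user}")
        if assistant is None:
            parts.append("ASSISTANT:")          # model continues from here
        else:
            parts.append(f"ASSISTANT: {assistant}</s>")  # </s> closes a finished turn
    return " ".join(parts)

prompt = build_prompt([("What does a 16K context length mean?", None)])
```

Because the model only supports 16K tokens of context, the assembled prompt (system text plus all prior turns) must be tokenized and truncated to fit before generation.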

Foundation model

LLaMA2
View details
