GPT-2B-001
GPT-2B-001 is a foundation model published by NVIDIA on 2023-04-20. It has 2.0B parameters and a 4K-token context length, and its checkpoint requires about 9.04GB of storage.
Data is sourced primarily from official releases (GitHub, Hugging Face, papers), then benchmark leaderboards, then third-party evaluators.
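As a rough sanity check on the storage figure, the sketch below estimates on-disk checkpoint size from the parameter count. It is a minimal illustration, not NVIDIA tooling: the checkpoint_size_gb helper is hypothetical, and it assumes dense fp32 weights (4 bytes per parameter), which this page does not state.

# Approximate on-disk size of a dense checkpoint from its parameter count.
def checkpoint_size_gb(n_params: float, bytes_per_param: int = 4) -> float:
    return n_params * bytes_per_param / 1e9

print(f"fp32: ~{checkpoint_size_gb(2.0e9):.2f} GB")    # ~8.00 GB
print(f"fp16: ~{checkpoint_size_gb(2.0e9, 2):.2f} GB")  # ~4.00 GB

At 4 bytes per parameter, 2.0B parameters comes to roughly 8GB, so the listed 9.04GB is consistent with an fp32 checkpoint plus file overhead (vocabulary, configuration, metadata).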

No curated comparisons for this model yet.