Generative Pre-trained Transformer 2
Generative Pre-trained Transformer 2 (GPT-2) is a foundation model published by OpenAI on 2019-02-14, with 1.5B parameters and a 1,024-token context length, released under the Modified MIT License.
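Since the released weights are also distributed through Hugging Face, a minimal sketch of loading the model and generating text with the transformers library is shown below. The checkpoint name "gpt2-xl" (the largest, 1.5B-parameter variant) and the generation settings are assumptions for illustration, not details from this card.

# Minimal sketch, assuming the Hugging Face `transformers` library and the
# "gpt2-xl" checkpoint; smaller variants ("gpt2", "gpt2-medium", "gpt2-large")
# expose the same interface.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2-xl")
model = AutoModelForCausalLM.from_pretrained("gpt2-xl")

# Encode a prompt; GPT-2's context window is 1,024 tokens, so longer inputs
# must be truncated before generation.
inputs = tokenizer("Generative Pre-trained Transformer 2 is", return_tensors="pt")

# Sample a short continuation.
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token by default
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))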
Data sourced primarily from official releases (GitHub, Hugging Face, papers), then benchmark leaderboards, then third-party evaluators.
