Bidirectional Encoder Representations from Transformers
Bidirectional Encoder Representations from Transformers (BERT) is an AI model published by Google Research, released on 2018-10-11, as a foundation model, with about 340M parameters (BERT-Large), a 512-token context length, and roughly 1.3GB of storage for the weights, under the Apache 2.0 license.
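The parameter, context-length, and storage figures above can be checked directly. The sketch below is a minimal, illustrative example and not part of the original page: it assumes this entry corresponds to the bert-large-uncased checkpoint on Hugging Face and uses the transformers library to count parameters, read the maximum supported sequence length, and estimate the size of the float32 weights.

# Minimal sketch (assumption: this entry maps to the bert-large-uncased checkpoint).
from transformers import BertModel

model = BertModel.from_pretrained("bert-large-uncased")

# Parameter count: roughly 340M (0.34B), i.e. millions rather than billions.
n_params = sum(p.numel() for p in model.parameters())
print(f"parameters: {n_params / 1e6:.0f}M")

# Maximum sequence length supported by the position embeddings: 512 tokens.
print(f"max context: {model.config.max_position_embeddings} tokens")

# Storage estimate: float32 weights at 4 bytes per parameter, about 1.3GB.
print(f"fp32 size: {n_params * 4 / 1e9:.2f} GB")

The 4-bytes-per-parameter estimate is what makes the roughly 340M-parameter count consistent with the ~1.3GB storage figure quoted above.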
Data is sourced primarily from official releases (GitHub, Hugging Face, papers), then from benchmark leaderboards, then from third-party evaluators.

No curated comparisons for this model yet.