Small BERT L-10_H-128_A-2 is a compact BERT model from Google for efficient English text classification. It has 10 transformer layers, a hidden size of 128, and 2 attention heads, a deep but narrow architecture that trades model capacity for speed and memory efficiency.
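To see why this configuration counts as "deep but narrow", the rough arithmetic below estimates the parameter count of a standard BERT encoder from its layer count and hidden size. The formula is a sketch using the usual BERT structure (WordPiece vocabulary of 30,522, 512 positions, 4x feed-forward expansion); the exact total for the released checkpoint may differ slightly.

```python
# Rough BERT parameter count from architecture hyperparameters.
# Assumes the standard BERT layout: token/position/type embeddings,
# self-attention with Q/K/V/output projections, 4x feed-forward
# expansion, LayerNorms, and a pooler head.

def bert_param_count(layers, hidden, vocab=30522, max_pos=512, type_vocab=2):
    ff = 4 * hidden  # BERT's feed-forward expansion factor
    embeddings = (vocab + max_pos + type_vocab) * hidden + 2 * hidden  # + LayerNorm
    per_layer = (
        3 * (hidden * hidden + hidden)   # Q, K, V projections
        + hidden * hidden + hidden       # attention output projection
        + 2 * hidden                     # attention LayerNorm
        + hidden * ff + ff               # feed-forward up-projection
        + ff * hidden + hidden           # feed-forward down-projection
        + 2 * hidden                     # feed-forward LayerNorm
    )
    pooler = hidden * hidden + hidden
    return embeddings + layers * per_layer + pooler

print(bert_param_count(10, 128))  # ~6.0M parameters
print(bert_param_count(12, 768))  # ~109.5M, i.e. BERT Base for comparison
```

At a hidden size of 128, the ten encoder layers together contribute only about 2M parameters; most of the model's weight is in the embedding table, which is why narrow BERT variants stay small even with many layers.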
Versions
| Version | Released | Context | Input / 1M | Output / 1M | Status |
|---|---|---|---|---|---|
| BERT Base Uncased | – | – | – | – | Available |
| BERT 2 EN Uncased L-10 H-256 A-4 | – | – | – | – | Available |
| BERT 2 EN Uncased L-10 H-512 A-8 | – | – | – | – | Available |
| BERT 2 EN Uncased L-10 H-768 A-12 | – | – | – | – | Available |
| BERT 2 EN Uncased L-12 H-128 A-2 | – | – | – | – | Available |
| BERT 2 EN Uncased L-12 H-512 A-8 | – | – | – | – | Available |
| BERT 2 EN Uncased L-2 H-128 A-2 | – | – | – | – | Available |
| BERT 2 EN Uncased L-2 H-512 A-8 | – | – | – | – | Available |
| BERT 2 EN Uncased L-2 H-768 A-12 | – | – | – | – | Available |
| BERT Base Cased | – | – | – | – | Available |
| Small BERT L-10_H-128_A-2 | – | – | – | – | Current |