OLMo 2 32B is a fully open 32B-parameter language model from the Allen Institute for AI (Ai2), designed to advance the science of language models through full transparency of training data and methodology.
Capabilities
Input modalities: 1/5
Output modalities: 1/5
Capabilities: 0/13
Versions
| Version | Released | Context | Input / 1M | Output / 1M | Status |
|---|---|---|---|---|---|
| OLMo 3.1 32B Instruct | — | 66K | — | — | Available |
| OLMo 3 32B Think | — | 66K | $0.150 | $0.500 | Available |
| OLMo 3 7B Instruct | — | 33K | — | — | Available |
| OLMo 3 7B Think | — | 33K | — | — | Available |
| OLMo 3.1 32B Think | — | — | — | — | Available |
| OLMo 2 32B | — | — | — | — | Current |
| OLMo 2 7B | — | — | — | — | Available |
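The per-million-token rates in the table translate directly into a per-request cost. A minimal sketch, using the listed rates for OLMo 3 32B Think ($0.150 / 1M input tokens, $0.500 / 1M output tokens); the token counts in the example are hypothetical:

```python
# Rates from the versions table for OLMo 3 32B Think (dollars per 1M tokens).
INPUT_RATE_PER_M = 0.150
OUTPUT_RATE_PER_M = 0.500

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of one request at the listed per-million-token rates."""
    return (input_tokens * INPUT_RATE_PER_M
            + output_tokens * OUTPUT_RATE_PER_M) / 1_000_000

# Hypothetical request: 2,000 prompt tokens, 500 completion tokens.
cost = request_cost(2_000, 500)  # 0.00055 dollars
```

At these rates, even long-context requests near the 66K window cost on the order of a cent.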