OLMo 2 32B is a 32-billion-parameter open base language model from Allen AI's OLMo 2 series, designed for transparency and reproducibility in large-scale language modeling research.
Capabilities
- Input modalities: 1/5 supported
- Output modalities: 1/5 supported
- Extended capabilities: 0/13 supported
Versions
| Version | Released | Context | Input / 1M | Output / 1M | Status |
|---|---|---|---|---|---|
| Olmo 3.1 32B Instruct | — | 66K | $0.200 | $0.600 | Available |
| Olmo 3 32B Think | — | 66K | $0.150 | $0.500 | Available |
| OLMo 3 7B Instruct | — | 33K | — | — | Available |
| Olmo 3 7B Think | — | 33K | — | — | Available |
| Olmo 2 32B Instruct | — | 128K | — | — | Available |
| OLMo 2 32B | — | — | — | — | Current |
| OLMo 2 7B | — | — | — | — | Available |