Mixtral 8x22B Instruct is Mistral AI's instruction-tuned variant of Mixtral 8x22B, a sparse Mixture-of-Experts (MoE) LLM fine-tuned for chat and instruction following, with 39B active parameters out of 141B total. It offers a 66K-token context window and up to 2K output tokens, and is available from 4 providers, starting at $1.20 / 1M input tokens and $1.20 / 1M output tokens.
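Several of the providers listed below expose OpenAI-compatible chat completions endpoints. As a minimal sketch, the request body for this model might look as follows; the model ID is taken from this page, while the endpoint URL and authentication are provider-specific and omitted here:

```python
import json

# Minimal OpenAI-compatible chat completions payload for Mixtral 8x22B Instruct.
# The model ID comes from this page; how you send it (URL, API key) depends on
# the provider and is not shown here.
payload = {
    "model": "mistral-mixtral-8x22b-instruct",
    "messages": [
        {"role": "user", "content": "Summarize the Mixture-of-Experts idea in one sentence."}
    ],
    # The model supports up to 2K output tokens, so cap the completion there.
    "max_tokens": 2000,
}

print(json.dumps(payload, indent=2))
```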
| Field | Value |
|---|---|
| Model ID | mistral-mixtral-8x22b-instruct |
| Category | Language |
| Status | Active |
| Context window | 66K tokens |
| Max output | 2K tokens |
| Input | Text |
| Output | Text |
| Parameters | 22B |
| Released | 2 years ago |
Capabilities
- Input modalities: 1/5 (Text)
- Output modalities: 1/5 (Text)
- Capabilities: 2/13
Pricing by Provider
| Provider | Input $ / 1M | Output $ / 1M | Cache Read $ / 1M |
|---|---|---|---|
| Fireworks AI | $1.20 | $1.20 | N/A |
| Nscale | $1.20 | $1.20 | N/A |
| Vercel AI Gateway | $1.20 | $1.20 | N/A |
| OpenRouter | $2.00 | $6.00 | $0.200 |
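Since pricing is quoted per million tokens, the cost of a single request is simple arithmetic. A small sketch using the rates from the table above (provider names and numbers are copied from the table; the helper function itself is illustrative):

```python
# Per-provider rates from the pricing table above: (input $/1M, output $/1M).
RATES = {
    "Fireworks AI": (1.20, 1.20),
    "Nscale": (1.20, 1.20),
    "Vercel AI Gateway": (1.20, 1.20),
    "OpenRouter": (2.00, 6.00),
}

def estimate_cost(provider: str, input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of one request: tokens times rate, per million."""
    in_rate, out_rate = RATES[provider]
    return (input_tokens * in_rate + output_tokens * out_rate) / 1_000_000

# Example: a 50K-token prompt with a 2K-token completion (the model's max output).
print(f"{estimate_cost('Fireworks AI', 50_000, 2_000):.4f}")  # 0.0624
print(f"{estimate_cost('OpenRouter', 50_000, 2_000):.4f}")    # 0.1120
```

Note that OpenRouter charges a higher output rate, so output-heavy workloads cost disproportionately more there than on the flat-rate providers.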
Versions
| Version | Released | Context | Input / 1M | Output / 1M | Status |
|---|---|---|---|---|---|
| Dolphin Mixtral 2.7 8x7B | — | — | — | — | Available |
| Dolphin 2.6 Mixtral 8x7B | — | 33K | — | — | Available |
| Dolphin Mixtral 2.5 8x7B | — | — | — | — | Available |
| Hermes 2 8x7B DPO | — | 33K | — | — | Available |
| MiniMax M2.1 | — | 1.0M | $0.290 | $0.950 | Available |
| MiniMax M2 | — | 205K | $0.255 | $1.00 | Available |
| Mixtral 8x22B Instruct | — | 66K | $1.20 | $1.20 | Current |
| Mixtral 8x7B Instruct | — | 33K | $0.070 | $0.280 | Available |
| Hermes 2 Mixtral 8x7B DPO | — | 33K | $0.500 | $0.500 | Available |
| KARAKURI LM 8x7B Instruct | — | — | — | — | Available |
HuggingFace
748 likes · 27,016 downloads/month · 6,048,572 total downloads