Dolphin 2.7 Mixtral 8x7B is a language model from Cognitive Computations: an AWQ-quantized fine-tune of Mixtral 8x7B optimized for efficient instruction-following and conversational tasks.
| Model ID | cognitive-computations-dolphin-mixtral-2-7-8x7b |
|---|---|
| Type | Language |
| Status | Active |
| Provider | Cognitive Computations |
| Input | Text |
| Output | Text |
| Parameters | 7B |
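Dolphin fine-tunes are generally trained on the ChatML conversation template, so prompts sent to the model should wrap each turn in `<|im_start|>`/`<|im_end|>` markers. A minimal sketch of building such a prompt (the helper name and example strings are illustrative, not from this page):

```python
def chatml_prompt(system: str, user: str) -> str:
    """Build a ChatML-formatted prompt, the template Dolphin fine-tunes
    are typically trained on (assumption; check the model card)."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

prompt = chatml_prompt(
    "You are Dolphin, a helpful assistant.",
    "Summarize AWQ quantization in one sentence.",
)
print(prompt)
```

The trailing `<|im_start|>assistant\n` leaves the prompt open for the model to complete; generation is typically stopped on the `<|im_end|>` token.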
Versions
| Version | Released | Context | Input / 1M | Output / 1M | Status |
|---|---|---|---|---|---|
| Dolphin 2.7 Mixtral 8x7B | — | — | — | — | Current |
| Dolphin 2.5 Mixtral 8x7B | — | — | — | — | Available |
| Dolphin 2.6 Mixtral 8x7B | — | 33K | $0.500 | $0.500 | Available |
| MiniMax M2 | — | 205K | $0.255 | $1.00 | Available |
| Mixtral 8x22B Instruct | — | 66K | $0.600 | $0.600 | Available |
| Mixtral 8x7B Instruct | — | 33K | $0.070 | $0.150 | Available |
| Hermes 2 Mixtral 8x7B DPO | — | 33K | $0.500 | $0.500 | Available |
| KARAKURI LM 8x7B Instruct | — | — | — | — | Available |
| Mixtral | — | — | — | — | Available |
| Mixtral 8x22B | — | 66K | $1.20 | $1.20 | Available |
| Mixtral 8x7B | — | 33K | $0.460 | $0.500 | Available |