Dolphin Mixtral 2.7 8x7B is Cognitive Computations' language model: a Dolphin 2.7 fine-tune of the Mixtral 8x7B mixture-of-experts (MoE) model with AWQ quantization, providing an uncensored, efficient instruction-following assistant.
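As a rough illustration of how an AWQ-quantized Dolphin checkpoint is typically loaded, here is a minimal sketch using Hugging Face transformers (with the autoawq package installed). The repo ID, system prompt, and generation settings are assumptions for illustration, not instructions from this catalog; substitute whichever AWQ upload you actually use.

```python
# Minimal sketch: load an AWQ build of Dolphin 2.7 Mixtral 8x7B and run one
# ChatML-formatted prompt. The repo ID below is an assumption, not an
# official pointer from this page.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/dolphin-2.7-mixtral-8x7b-AWQ"  # assumed/hypothetical repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # spread the quantized experts across available GPUs
)

# Dolphin 2.x models use the ChatML prompt format, which the tokenizer's
# chat template is expected to apply here.
messages = [
    {"role": "system", "content": "You are Dolphin, a helpful AI assistant."},
    {"role": "user", "content": "Explain mixture-of-experts routing in one sentence."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```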
| Model ID | cognitivecomputations-dolphin-mixtral-2-7-8x7b |
|---|---|
| Category | Language |
| Status | Active |
| Provider | Cognitive Computations |
| Input | Text |
| Output | Text |
| Parameters | 7B |
Capabilities

- Input modalities: 1/5 (text)
- Output modalities: 1/5 (text)
- Capabilities: 0/13 supported
Versions
| Version | Released | Context | Input / 1M | Output / 1M | Status |
|---|---|---|---|---|---|
| Dolphin Mixtral 2.7 8x7B | — | — | — | — | Current |
| Dolphin 2.6 Mixtral 8x7B | — | 33K | — | — | Available |
| Dolphin Mixtral 2.5 8x7B | — | — | — | — | Available |
| Hermes 2 8x7B DPO | — | 33K | — | — | Available |
| MiniMax M2.1 | — | 1.0M | $0.290 | $0.950 | Available |
| MiniMax M2 | — | 205K | $0.255 | $1.00 | Available |
| Mixtral 8x22B Instruct | — | 66K | $1.20 | $1.20 | Available |
| Mixtral 8x7B Instruct | — | 33K | $0.070 | $0.280 | Available |
| Hermes 2 Mixtral 8x7B DPO | — | 33K | $0.500 | $0.500 | Available |
| KARAKURI LM 8x7B Instruct | — | — | — | — | Available |
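The per-1M-token prices in the table translate directly into request cost. Below is a small sketch of that arithmetic; the `request_cost` helper and the token counts are illustrative, and the prices are taken from the Mixtral 8x7B Instruct row above.

```python
# Sketch: estimate the USD cost of one request from per-million-token prices.

def request_cost(input_tokens: int, output_tokens: int,
                 input_per_1m: float, output_per_1m: float) -> float:
    """Cost of a single request given prices quoted per 1M tokens."""
    return (input_tokens * input_per_1m + output_tokens * output_per_1m) / 1_000_000

# Mixtral 8x7B Instruct: $0.070 per 1M input tokens, $0.280 per 1M output tokens.
# A request with 4,000 input tokens and 1,000 output tokens:
print(f"${request_cost(4_000, 1_000, 0.070, 0.280):.6f}")  # -> $0.000560
```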