Dolphin Mixtral 2.7 8x7B


Dolphin Mixtral 2.7 8x7B is a language model from Cognitive Computations: a Dolphin 2.7 fine-tune of the Mixtral 8x7B mixture-of-experts (MoE) model with AWQ quantization, providing an uncensored, efficient instruction-following assistant.
Spec

Canonical ID: cognitivecomputations-dolphin-mixtral-2-7-8x7b
Type: Language
Status: Active
Creator: Cognitive Computations
Input Modalities: Text
Output Modalities: Text
Parameters: 7B
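A minimal sketch of addressing this model through an OpenAI-compatible chat-completions API. The endpoint shape and parameter names are assumptions based on common provider conventions; only the canonical ID comes from the spec above.

```python
# Canonical ID from the spec above.
MODEL_ID = "cognitivecomputations-dolphin-mixtral-2-7-8x7b"

def build_chat_request(user_prompt: str, max_tokens: int = 256) -> dict:
    """Build a chat-completion payload (text in, text out, matching
    the model's input/output modalities). The payload shape follows
    the common OpenAI-style convention; field names are assumptions."""
    return {
        "model": MODEL_ID,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_prompt},
        ],
        "max_tokens": max_tokens,
    }

payload = build_chat_request("Explain AWQ quantization in one sentence.")
print(payload["model"])
```

Since the capabilities table below shows no function calling or structured outputs, a request for this model would carry only plain text messages like the sketch above.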

Capabilities

Input (1/5): Text. Not supported: Image, Audio, Video, PDF.
Output (1/5): Text. Not supported: Image, Audio, Video, Embedding.
Capabilities (0/13): none. Not supported: Reasoning, Adaptive Reasoning, Function Calling, Parallel Function Calling, Structured Outputs, Native JSON Schema, Web Search, URL Context, Computer Use, Code Execution, File Search, Prompt Caching, Assistant Prefill.

Versions

Version                   | Released | Context | Input / 1M | Output / 1M | Status
Dolphin Mixtral 2.7 8x7B  | –        | –       | –          | –           | Current
Dolphin 2.6 Mixtral 8x7B  | –        | 33K     | –          | –           | Available
Dolphin Mixtral 2.5 8x7B  | –        | –       | –          | –           | Available
Hermes 2 8x7B DPO         | –        | 33K     | –          | –           | Available
MiniMax M2.1              | –        | 1.0M    | $0.290     | $0.950      | Available
MiniMax M2                | –        | 205K    | $0.255     | $1.00       | Available
Mixtral 8x22B Instruct    | –        | 66K     | $1.20      | $1.20       | Available
Mixtral 8x7B Instruct     | –        | 33K     | $0.070     | $0.280      | Available
Hermes 2 Mixtral 8x7B DPO | –        | 33K     | $0.500     | $0.500     | Available
KARAKURI LM 8x7B Instruct | –        | –       | –          | –           | Available
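The Input / 1M and Output / 1M columns are USD prices per 1,000,000 tokens, so a request's cost is each token count times its rate, divided by one million. A small sketch using the priced rows from the table above (the function name is illustrative, not a provider API):

```python
# USD per 1,000,000 tokens, (input, output), from the versions table above.
PRICES_PER_1M = {
    "Mixtral 8x22B Instruct": (1.20, 1.20),
    "Mixtral 8x7B Instruct": (0.070, 0.280),
    "Hermes 2 Mixtral 8x7B DPO": (0.500, 0.500),
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of one request: tokens * rate / 1M per direction."""
    in_price, out_price = PRICES_PER_1M[model]
    return (input_tokens * in_price + output_tokens * out_price) / 1_000_000

# e.g. 1M input + 500K output tokens on Mixtral 8x7B Instruct:
print(round(estimate_cost("Mixtral 8x7B Instruct", 1_000_000, 500_000), 3))  # 0.21
```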

Model IDs