Mistral Large 2407

Mistral Large 2407 is a text-generation model from Mistral AI with a 128K-token context window and a maximum output of up to 128K tokens. Hosted pricing starts at $2.00 per million input tokens and $6.00 per million output tokens (via Vertex AI or Azure AI); the Ollama variant runs locally and has no per-token price.

Capabilities

Vision · Function Calling · Reasoning · JSON Schema · System Messages · Web Search · Prompt Caching · Audio Input · Audio Output

Specifications

Model Key: mistral/mistral-large-2407
Provider: Mistral AI
Provider ID: mistral
Mode: Text
Canonical Name: mistral-large-2407
Context Window: 128K tokens
Max Output: 128K tokens

Pricing

Type          | Per 1K Tokens | Per 1M Tokens
Input Tokens  | $0.0030       | $3.00
Output Tokens | $0.0090       | $9.00
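Token costs scale linearly with usage, so per-request cost is simple arithmetic over the per-million rates above. A minimal sketch (prices are the Mistral AI rates from this page; the request sizes are hypothetical):

```python
# Per-1M-token prices for Mistral Large 2407 via Mistral AI (from the table above).
INPUT_PRICE_PER_M = 3.00
OUTPUT_PRICE_PER_M = 9.00

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in dollars for one request at per-million-token pricing."""
    return (input_tokens * INPUT_PRICE_PER_M + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Example: a 10K-token prompt with a 2K-token completion.
cost = request_cost(10_000, 2_000)
print(f"${cost:.4f}")  # $0.0480
```

Note the per-1K and per-1M columns are the same rate at different scales: $0.0030 per 1K input tokens is exactly $3.00 per 1M.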

Benchmarks

Benchmark           | Score | Rank
Intelligence Index  | 13.0  | #152
Math Index          | 0.0   | #140
MMLU-Pro            | 0.7   | #121
GPQA                | 0.5   | #144
HLE                 | 0.0   | #216
LiveCodeBench       | 0.3   | #125
AIME                | 0.1   | #88
IFBench             | 0.3   | #130
Time to First Token | 0.00s | #1
SciCode             | 0.3   | #120
MATH-500            | 0.7   | #89
AIME 2025           | 0.0   | #140
LCR                 | 0.0   | #149
TAU2                | 0.3   | #78

Price Comparison by Provider

Compare prices for Mistral Large 2407 across different providers. The same model may be available through multiple providers at different price points.

Provider            | Model Key                          | Input Price, $/1M | Output Price, $/1M
Vertex AI (Mistral) | vertex_ai/mistral-large@2407       | 2.00              | 6.00
Ollama              | ollama/mistral-large-instruct-2407 | N/A               | N/A
Mistral AI          | mistral/mistral-large-2407         | 3.00              | 9.00
AWS Bedrock         | mistral.mistral-large-2407-v1:0    | 3.00              | 9.00
Azure AI            | azure_ai/mistral-large-2407        | 2.00              | 6.00
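Given per-provider rates, picking the cheapest hosted option for a given request shape can be automated. A sketch, assuming the price data above (`None` marks Ollama, which is local and has no per-token price):

```python
# (input, output) price per 1M tokens by provider; None = no per-token price (local).
PRICES = {
    "Vertex AI (Mistral)": (2.00, 6.00),
    "Ollama": (None, None),
    "Mistral AI": (3.00, 9.00),
    "AWS Bedrock": (3.00, 9.00),
    "Azure AI": (2.00, 6.00),
}

def cheapest_hosted(input_tokens: int, output_tokens: int) -> str:
    """Return the hosted provider with the lowest total cost for a request shape."""
    hosted = {p: v for p, v in PRICES.items() if v[0] is not None}
    return min(
        hosted,
        key=lambda p: (input_tokens * hosted[p][0] + output_tokens * hosted[p][1]) / 1e6,
    )

# Vertex AI (Mistral) and Azure AI tie at $2.00/$6.00; min() keeps the first.
print(cheapest_hosted(100_000, 10_000))  # Vertex AI (Mistral)
```

Ties (here Vertex AI and Azure AI) resolve to whichever provider appears first in the dictionary, so a real selector might also rank by latency or region.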

All Variants

All available versions, regions, and API endpoints for Mistral Large 2407.

Model Key                          | Provider            | Mode | Input Price, $/1M | Output Price, $/1M | Context | Max Output | Vision | Functions
mistral.mistral-large-2407-v1:0    | AWS Bedrock         | Text | 3.00              | 9.00               | 128K    | 8K         | no     | yes
azure_ai/mistral-large-2407        | Azure AI            | Text | 2.00              | 6.00               | 128K    | 4K         | no     | yes
mistral/mistral-large-2407         | Mistral AI          | Text | 3.00              | 9.00               | 128K    | 128K       | no     | yes
ollama/mistral-large-instruct-2407 | Ollama              | Text | N/A               | N/A                | 66K     | 8K         | no     | yes
vertex_ai/mistral-large@2407       | Vertex AI (Mistral) | Text | 2.00              | 6.00               | 128K    | 8K         | no     | yes