# AI Model Comparison
Compare AI model pricing and benchmarks across providers including OpenAI, Anthropic, Google, Amazon Bedrock, Azure, Mistral, and more. Filter by model capabilities like vision, function calling, and reasoning support. Find the most cost-effective model for your use case. Currently tracking 2,529 models across 98 providers.
| Model | Input Price, $/1M | Output Price, $/1M | Inference Providers | Context | Max Output | Intelligence | Coding |
|---|---|---|---|---|---|---|---|
| Claude Opus 4.7 | 5.00 | 25.00 | 7 | 1.0M | 128K | 57.3 (#1) | 52.5 (#1) |
| Claude Opus 4.6 | 5.00 | 25.00 | 7 | 1.0M | 128K | 46.5 (#2) | 47.6 (#2) |
| Claude Opus 4.5 | 5.00 | 25.00 | 9 | 410K | 64K | 43.1 (#3) | 42.9 (#4) |
| Claude Sonnet 4.6 | 3.00 | 15.00 | 7 | 1.0M | 128K | 42.6 (#4) | 43.0 (#3) |
| Claude Sonnet 4.5 | 3.00 | 15.00 | 10 | 1.0M | 64K | 37.1 (#5) | 33.5 (#8) |
| GPT-5.4 | 2.50 | 15.00 | 5 | 1.1M | 128K | 35.4 (#6) | 41.0 (#5) |
| GPT-5.2 | 1.75 | 14.00 | 7 | 410K | 128K | 33.6 (#7) | 34.7 (#6) |
| o4 Mini | 1.00 | 4.00 | 6 | 200K | 100K | 33.1 (#8) | 25.6 (#13) |
| Claude Sonnet 4 | 3.00 | 15.00 | 10 | 1.0M | 64K | 33.0 (#10) | 30.6 (#9) |
| Claude Opus 4 | 15.00 | 75.00 | 9 | 410K | 32K | 33.0 (#9) | 34.0 (#7) |
| Claude Haiku 4.5 | 1.00 | 5.00 | 9 | 200K | 64K | 31.1 (#11) | 29.6 (#10) |
| Claude Sonnet 3.7 | 3.00 | 15.00 | 10 | 200K | 128K | 30.8 (#12) | 26.7 (#12) |
| GPT-5.1 | 1.25 | 10.00 | 7 | 410K | 128K | 27.4 (#13) | 27.3 (#11) |
| GPT-4.1 | 2.00 | 8.00 | 7 | 1.0M | 33K | 26.3 (#14) | 21.8 (#15) |
| GPT OSS 120B | 0.039 | 0.190 | 17 | 131K | 131K | 24.5 (#15) | 15.5 (#19) |
| GPT-5 | 1.25 | 10.00 | 9 | 410K | 128K | 21.8 (#16) | 21.2 (#16) |
| GPT OSS 20B | 0.030 | 0.140 | 13 | 131K | 131K | 20.8 (#17) | 14.4 (#20) |
| GPT-5 Mini | 0.250 | 2.00 | 8 | 400K | 128K | 20.7 (#18) | 21.9 (#14) |
| Llama 4 Maverick | 0.150 | 0.600 | 4 | 1.0M | 16K | 18.4 (#19) | 15.6 (#18) |
| Gemini 2.5 Flash | 0.150 | 0.600 | 9 | 1.0M | 66K | 17.8 (#20) | 17.8 (#17) |
| DeepSeek R1 | 0.280 | 0.400 | 14 | 164K | 66K | 16.4 (#21) | 7.8 (#23) |
| Mistral Large 2 | 2.00 | 6.00 | 1 | 128K | 8K | 15.1 (#22) | 13.8 (#22) |
| Pixtral Large | 2.00 | 6.00 | 4 | 131K | 4K | 14.0 (#23) | N/A |
| GPT-5 Nano | 0.050 | 0.400 | 7 | 400K | 128K | 13.8 (#24) | 14.2 (#21) |
| Llama 4 Scout | 0.080 | 0.300 | 3 | 328K | 16K | 13.5 (#25) | 6.7 (#25) |
| Gemini 2.5 Flash Lite | 0.075 | 0.300 | 6 | 1.0M | 66K | 12.7 (#26) | 7.4 (#24) |
| Reka Flash | 0.900 | 0.900 | 1 | 100K | 8K | 12.0 (#27) | N/A |
| Jamba 1.5 Large | 2.00 | 2.80 | 4 | 256K | 8K | 10.7 (#28) | N/A |
| Mistral Large | 2.00 | 6.00 | 7 | 262K | 16K | 9.9 (#29) | N/A |
| Llama 2 70B Chat | 0.500 | 0.900 | 6 | 4K | 4K | 8.4 (#30) | N/A |
| Jamba 1.5 Mini | 0.200 | 0.200 | 4 | 256K | 8K | 8.0 (#31) | N/A |
| Arctic | 1.68 | 1.68 | 1 | 4K | N/A | N/A | N/A |
| Arctic Embed 1.5 M | 0.060 | 0.060 | 1 | N/A | N/A | N/A | N/A |
| Arctic Embed 2 L | 0.100 | 0.100 | 1 | N/A | N/A | N/A | N/A |
| Arctic Embed M | 0.060 | 0.060 | 1 | N/A | N/A | N/A | N/A |
| Arctic Extract | 10.00 | 10.00 | 1 | N/A | N/A | N/A | N/A |
| Arctic Tilt Entity | 19.00 | 19.00 | 1 | N/A | N/A | N/A | N/A |
| Arctic Tilt Table | 56.80 | 56.80 | 1 | N/A | N/A | N/A | N/A |
| E5 2 Base | 0.060 | 0.060 | 1 | N/A | N/A | N/A | N/A |
| Gemini 3.1 Pro | 2.20 | 13.20 | 1 | N/A | N/A | N/A | N/A |
| Gemma 7B | 0.200 | 0.200 | 2 | 8K | 8K | N/A | N/A |
| Jamba Instruct | 0.500 | 0.700 | 3 | 256K | 8K | N/A | N/A |
| Llama 3 70B | 0.590 | 0.790 | 3 | 8K | 8K | N/A | N/A |
| Llama 3 8B | 0.050 | 0.080 | 4 | 8K | 8K | N/A | N/A |
| Llama 3.1 405B | 2.40 | 2.40 | 1 | 128K | 8K | N/A | N/A |
| Llama 3.1 70B | 0.600 | 0.600 | 3 | 128K | 8K | N/A | N/A |
| Llama 3.1 8B | 0.100 | 0.100 | 4 | 131K | 8K | N/A | N/A |
| Llama 3.2 1B | 0.100 | 0.100 | 3 | 131K | 8K | N/A | N/A |
| Llama 3.2 3B | 0.100 | 0.100 | 3 | 131K | 8K | N/A | N/A |
| Llama 3.3 70B Instruct | 0.720 | 0.720 | 4 | 131K | 8K | N/A | N/A |
| Mistral 7B | 0.050 | 0.200 | 4 | 33K | 8K | N/A | N/A |
| Mixtral 8x7B | 0.460 | 0.500 | 3 | 33K | 8K | N/A | N/A |
| Multilingual E5 Large | 0.100 | 0.100 | 1 | N/A | N/A | N/A | N/A |
| NV Embed QA 4 | 0.100 | 0.100 | 1 | N/A | N/A | N/A | N/A |
| Reka Core | 11.00 | 11.00 | 1 | 32K | 8K | N/A | N/A |
| Voyage Multilingual 2 | 0.140 | 0.140 | 1 | N/A | N/A | N/A | N/A |
| Voyage Multimodal 3 | 0.120 | 0.120 | 2 | 32K | N/A | N/A | N/A |
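To compare cost-effectiveness across the table, it helps to collapse the two prices into one blended figure and divide a benchmark score by it. The sketch below does this for a few rows from the table; it assumes prices are USD per 1M tokens and uses an illustrative 3:1 input-to-output token ratio (a hypothetical workload, not part of the source data).

```python
# Blend input/output prices into one $/1M figure, then compute a simple
# "intelligence per dollar" ratio. The 3:1 input:output ratio is an
# illustrative assumption; adjust it for your own workload.

MODELS = {
    # name: (input $/1M, output $/1M, intelligence score from the table)
    "Claude Opus 4.7":  (5.00,  25.00, 57.3),
    "Claude Haiku 4.5": (1.00,   5.00, 31.1),
    "GPT-5 Mini":       (0.25,   2.00, 20.7),
    "GPT OSS 120B":     (0.039,  0.19, 24.5),
}

def blended_price(inp: float, out: float, ratio: float = 3.0) -> float:
    """Weighted price per 1M tokens for a workload with `ratio` input
    tokens per output token."""
    return (ratio * inp + out) / (ratio + 1)

for name, (inp, out, intel) in MODELS.items():
    price = blended_price(inp, out)
    print(f"{name:18s} blended ${price:7.4f}/1M  intelligence/$ = {intel / price:6.1f}")
```

On these numbers the cheapest model is roughly two orders of magnitude more score-per-dollar than the top-ranked one: the blended price for Claude Opus 4.7 is (3 × 5.00 + 25.00) / 4 = $10.00 per 1M tokens, versus about $0.077 for GPT OSS 120B, so raw capability and cost-efficiency rank very differently.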