AI Models Comparison
Compare AI model pricing and benchmarks across providers including OpenAI, Anthropic, Google, AWS Bedrock, Azure, Mistral, and more. Filter by model capabilities like vision, function calling, and reasoning support. Find the most cost-effective model for your use case. Currently tracking 1,771 models across 99 providers.
The pricing data is based on LiteLLM, which is maintained by the open-source community; benchmark scores come from Artificial Analysis. The data was last updated on February 27, 2026 at 12:00 AM UTC.
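All prices in the table are quoted in US dollars per million tokens, so the cost of a single request is `input_tokens/1e6 * input_price + output_tokens/1e6 * output_price`. Here is a minimal sketch of that arithmetic, with a few prices copied from the table; the `PRICES` dict and `estimate_cost` helper are illustrative, not part of any provider's SDK.

```python
# Per-request cost from per-million-token prices.
# PRICES maps model -> (input, output) USD prices per 1M tokens,
# copied from the table below. estimate_cost() is an illustrative
# helper, not a library function.
PRICES = {
    "Claude Sonnet 4.5": (3.00, 15.00),
    "Gemini 2.5 Flash": (0.300, 2.50),
    "GPT-4.1 mini": (0.400, 1.60),
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request for the given token counts."""
    input_price, output_price = PRICES[model]
    return (input_tokens / 1_000_000) * input_price + (output_tokens / 1_000_000) * output_price

# Example: a 20K-token prompt that yields a 1K-token answer.
for model in PRICES:
    print(f"{model}: ${estimate_cost(model, 20_000, 1_000):.4f}")
```

For that 20K-in/1K-out example the totals are $0.0750 (Claude Sonnet 4.5), $0.0085 (Gemini 2.5 Flash), and $0.0096 (GPT-4.1 mini). Because output tokens cost several times more than input tokens on most models, the input/output mix of your workload matters as much as the headline prices.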
| Model | Input Price ($/1M tokens) | Output Price ($/1M tokens) | Context | Max Output | Intelligence | Coding |
|---|---|---|---|---|---|---|
| Grok 4 | 3.00 | 15.00 | 256K | 256K | 40.7 | 40.5 |
| o3 | 2.00 | 8.00 | 200K | 100K | 38.4 | 38.4 |
| Claude Sonnet 4.5 | 3.00 | 15.00 | 1.0M | 64K | 37.1 | 33.5 |
| Claude Opus 4.5 | 5.00 | 25.00 | 200K | 64K | 35.3 | 42.9 |
| o4-mini | 1.10 | 4.40 | 200K | 100K | 33.1 | 25.6 |
| Claude 4 Sonnet | 3.00 | 15.00 | 200K | 64K | 33.0 | 30.6 |
| o1 | 15.00 | 60.00 | 200K | 100K | 30.8 | 20.5 |
| Claude 3.7 Sonnet | 3.00 | 15.00 | 200K | 64K | 30.8 | 26.7 |
| Gemini 2.5 Pro | 2.50 | 10.00 | 1.0M | 66K | 30.3 | 46.7 |
| GLM 4.6 | 0.450 | 1.80 | 200K | 200K | 30.2 | 30.2 |
| DeepSeek R1 | 0.550 | 2.19 | 128K | 8K | 27.1 | 24.0 |
| GLM 4.5 | 0.600 | 2.20 | 131K | 131K | 26.4 | 26.3 |
| GLM 4.5 Air | 0.200 | 1.10 | 128K | 96K | 26.3 | 23.8 |
| GPT-4.1 | 2.00 | 8.00 | 1.0M | 33K | 26.3 | 21.8 |
| Kimi K2 | 0.550 | 2.20 | 131K | 16K | 26.3 | 22.1 |
| o3-mini | 1.10 | 4.40 | 200K | 100K | 25.9 | 17.9 |
| Grok 3 | 3.00 | 15.00 | 131K | 131K | 25.2 | 19.8 |
| Claude Opus 4.1 | 15.00 | 75.00 | 200K | 32K | 23.6 | N/A |
| GPT-4.1 mini | 0.400 | 1.60 | 1.0M | 33K | 22.2 | 18.5 |
| Claude 4 Opus | 15.00 | 75.00 | 200K | 32K | 22.2 | N/A |
| Claude Haiku 4.5 | 1.00 | 5.00 | 200K | 64K | 21.8 | 29.6 |
| Gemini 2.5 Flash | 0.300 | 2.50 | 1.0M | 66K | 21.1 | 17.8 |
| Llama 4 Maverick | 0.200 | 0.600 | 131K | 8K | 18.4 | 15.6 |
| Sonar Reasoning | 1.00 | 5.00 | 127K | 8K | 17.9 | N/A |
| Gemini 2.0 Flash | 0.150 | 0.600 | 1.0M | 8K | 17.6 | 13.6 |
| Magistral Medium | 2.00 | 5.00 | 128K | 64K | 17.4 | 16.0 |
| DeepSeek V3 | 0.900 | 0.900 | 128K | 8K | 17.1 | 16.4 |
| Magistral Small | 0.500 | 1.50 | 128K | 64K | 16.8 | 11.1 |
| DeepSeek R1 Distill Llama 70B | 0.750 | 0.990 | 131K | 131K | 16.0 | 11.4 |
| Qwen 3 235B | 0.200 | 0.600 | 41K | 16K | 16.0 | 14.0 |
| Claude 3.5 Sonnet | 3.00 | 15.00 | 200K | 8K | 15.9 | 30.2 |
| Sonar | 1.00 | 1.00 | 127K | 8K | 15.5 | N/A |
| Sonar Reasoning Pro | 2.00 | 8.00 | 127K | 8K | 15.2 | N/A |
| GPT-4.1 nano | 0.100 | 0.400 | 1.0M | 33K | 14.9 | 11.2 |
| GPT-4o | 2.50 | 10.00 | 128K | 16K | 14.8 | 16.7 |
| Devstral Small | 0.070 | 0.280 | 128K | 128K | 14.8 | 12.1 |
| Gemini 2.0 Flash-Lite | 0.075 | 0.300 | 1.0M | 8K | 14.7 | N/A |
| Command A | 2.50 | 10.00 | 256K | 8K | 14.7 | 9.9 |
| Qwen 3 30B | 0.100 | 0.300 | 41K | 16K | 14.6 | 13.3 |
| Llama 3.3 70B | 0.720 | 0.720 | 128K | 8K | 14.5 | 10.7 |
| Qwen 3 32B | 0.100 | 0.300 | 41K | 16K | 14.5 | N/A |
| Nova Pro | 0.800 | 3.20 | 300K | 8K | 14.0 | 11.0 |
| Llama 4 Scout | 0.100 | 0.300 | 131K | 8K | 13.5 | 6.7 |
| Llama 3.1 70B | 0.720 | 0.720 | 128K | 8K | 13.1 | 10.9 |
| GPT-4 Turbo | 10.00 | 30.00 | 128K | 4K | 12.8 | 13.1 |
| Nova Lite | 0.060 | 0.240 | 300K | 8K | 12.8 | 5.1 |
| GPT-4o mini | 0.150 | 0.600 | 128K | 16K | 12.6 | N/A |
| Claude 3 Opus | 15.00 | 75.00 | 200K | 4K | 12.5 | 19.5 |
| Claude 3.5 Haiku | 0.800 | 4.00 | 200K | 8K | 12.3 | 10.7 |
| Mistral Saba 24B | 0.790 | 0.790 | 33K | 33K | 12.1 | N/A |
| Llama 3.2 90B | 0.720 | 0.720 | 128K | 8K | 11.9 | N/A |
| Nova Micro | 0.035 | 0.140 | 128K | 8K | 11.6 | 4.1 |
| Llama 3.1 8B | 0.050 | 0.080 | 131K | 131K | 11.3 | 4.9 |
| Llama 3.2 11B | 0.160 | 0.160 | 128K | 8K | 10.9 | 4.3 |
| Mistral Small | 0.100 | 0.300 | 32K | 4K | 10.2 | N/A |
| Llama 3 70B | 0.590 | 0.790 | 8K | 8K | 10.2 | 6.8 |
| Mistral Large | 2.00 | 6.00 | 32K | 4K | 9.9 | N/A |
| Mixtral 8x22B Instruct | 1.20 | 1.20 | 66K | 2K | 9.8 | N/A |
| Llama 3.2 3B | 0.150 | 0.150 | 128K | 8K | 9.7 | N/A |
| Claude 3 Haiku | 0.250 | 1.25 | 200K | 4K | 9.3 | 6.7 |
| Llama 3.2 1B | 0.100 | 0.100 | 128K | 8K | 9.1 | 0.6 |
| GPT-3.5 Turbo | 0.500 | 1.50 | 16K | 4K | 9.0 | 10.7 |
| Llama 3 8B | 0.050 | 0.080 | 8K | 8K | 8.7 | 4.0 |
| Command R+ | 2.50 | 10.00 | 128K | 4K | 8.3 | N/A |
| Command R | 0.150 | 0.600 | 128K | 4K | 7.4 | N/A |
| Qwen 3 14B | 0.080 | 0.240 | 41K | 16K | 7.4 | N/A |
| Grok 3 Mini Fast | 0.600 | 4.00 | 131K | 131K | N/A | N/A |
| Grok 3 Mini | 0.300 | 0.500 | 131K | 131K | N/A | N/A |
| Grok 3 Fast | 5.00 | 25.00 | 131K | 131K | N/A | N/A |
| Grok 2 Vision | 2.00 | 10.00 | 33K | 33K | N/A | N/A |
| Grok 2 | 2.00 | 10.00 | 131K | 4K | N/A | N/A |
| v0-1.5-md | 3.00 | 15.00 | 128K | 33K | N/A | N/A |
| v0-1.0-md | 3.00 | 15.00 | 128K | 32K | N/A | N/A |
| Morph V3 Large | 0.900 | 1.90 | 33K | 16K | N/A | N/A |
| Morph V3 Fast | 0.800 | 1.20 | 33K | 16K | N/A | N/A |
| Pixtral Large | 2.00 | 6.00 | 128K | 4K | N/A | N/A |
| Pixtral 12B | 0.150 | 0.150 | 128K | 4K | N/A | N/A |
| Mistral Embed | 0.100 | N/A | N/A | N/A | N/A | N/A |
| Ministral 8B | 0.100 | 0.100 | 128K | 4K | N/A | N/A |
| Ministral 3B | 0.040 | 0.040 | 128K | 4K | N/A | N/A |
| Codestral Embed | 0.150 | N/A | N/A | N/A | N/A | N/A |
| Codestral | 0.300 | 0.900 | 256K | 4K | N/A | N/A |
| Mercury Coder Small | 0.250 | 1.00 | 32K | 16K | N/A | N/A |
| Gemma 2 9B | 0.200 | 0.200 | 8K | 8K | N/A | N/A |
| Embed V4.0 | 0.120 | N/A | N/A | N/A | N/A | N/A |
| Claude Opus 4.6 | 5.00 | 25.00 | 200K | 64K | N/A | 47.6 |
| Titan Embed Text V2 | 0.020 | N/A | N/A | N/A | N/A | N/A |
| Qwen3 Coder | 0.400 | 1.60 | 262K | 67K | N/A | N/A |
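Since the pricing side of this comparison comes from LiteLLM, you can reproduce a filtered view, such as the capability filters mentioned above, directly from the source data. The sketch below assumes LiteLLM still publishes its catalog as `model_prices_and_context_window.json` in the BerriAI/litellm repository and that field names like `input_cost_per_token`, `supports_vision`, and `litellm_provider` are unchanged; verify both against the repository before relying on them.

```python
# Sketch: fetch LiteLLM's community-maintained model catalog and list
# vision-capable models priced at or below $0.50 per 1M input tokens.
# ASSUMPTION: the raw-file URL and field names below match the current
# BerriAI/litellm repository -- check them before use.
import json
import urllib.request

URL = ("https://raw.githubusercontent.com/BerriAI/litellm/main/"
       "model_prices_and_context_window.json")

with urllib.request.urlopen(URL) as resp:
    catalog = json.load(resp)

for name, spec in catalog.items():
    if not isinstance(spec, dict):
        continue
    cost = spec.get("input_cost_per_token")
    if not isinstance(cost, (int, float)):
        continue  # skips non-model entries such as the file's documentation stub
    if spec.get("supports_vision") and cost * 1_000_000 <= 0.50:
        print(f"{name}: ${cost * 1_000_000:.3f}/1M input tokens "
              f"({spec.get('litellm_provider', '?')})")
```

If you already depend on the `litellm` Python package, recent versions expose the same catalog in-process as the `litellm.model_cost` dict, which avoids the network fetch.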