# AI Models Comparison
Compare AI model pricing and benchmark scores across providers including OpenAI, Anthropic, Google, AWS Bedrock, Azure, Mistral, and more. Filter by model capabilities such as vision, function calling, and reasoning support to find the most cost-effective model for your use case. Currently tracking 1,909 models across 103 providers.
Pricing data comes from LiteLLM, maintained by the open-source community; benchmark data comes from Artificial Analysis. Last updated April 15, 2026 at 12:00 AM UTC.
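Reading the table: assuming prices are quoted in dollars per million tokens (the usual convention for listings like this; the original header says only "$"), the cost of a single request can be estimated with a one-line formula. The function name and the example token counts below are illustrative, not part of the source.

```python
def request_cost(input_tokens: int, output_tokens: int,
                 input_price: float, output_price: float) -> float:
    """Estimate the USD cost of one request.

    Prices are assumed to be in dollars per 1M tokens, matching the
    convention of the pricing table.
    """
    return (input_tokens * input_price + output_tokens * output_price) / 1_000_000

# Example: Claude Sonnet 4.6 ($3.00 input / $15.00 output per 1M tokens),
# a request with 10K input tokens and 1K output tokens:
cost = request_cost(10_000, 1_000, 3.00, 15.00)
print(f"${cost:.3f}")  # → $0.045
```

Note that output tokens usually dominate cost for long generations: for most models in the table the output price is 3–5x the input price.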
| Model | Input Price, $/1M | Output Price, $/1M | Context | Max Output | Intelligence (rank) | Coding (rank) |
|---|---|---|---|---|---|---|
| Zai.glm 5 | 1.00 | 3.20 | 200K | 128K | 49.8 (#5) | 44.2 (#11) |
| Moonshotai.kimi K2.5 | 0.600 | 3.00 | 262K | 262K | 46.8 (#11) | 39.5 (#18) |
| Claude Opus 4.6 | 5.00 | 25.00 | 1.0M | 128K | 46.5 (#12) | 47.6 (#6) |
| Claude Sonnet 4.6 | 3.00 | 15.00 | 1.0M | 64K | 44.4 (#16) | 46.4 (#9) |
| Claude Opus 4.5 | 5.00 | 25.00 | 200K | 64K | 43.1 (#18) | 42.9 (#14) |
| Zai.glm 4.7 | 0.600 | 2.20 | 200K | 128K | 42.1 (#20) | 36.3 (#28) |
| Minimax.minimax M2.5 | 0.360 | 1.44 | 1.0M | 8K | 41.9 (#22) | 37.4 (#24) |
| Claude Sonnet 4.5 | 3.00 | 15.00 | 200K | 64K | 37.1 (#32) | 33.5 (#35) |
| Minimax.minimax M2 | 0.300 | 1.20 | 128K | 8K | 36.1 (#34) | 29.2 (#44) |
| Claude Opus 4.1 | 15.00 | 75.00 | 200K | 32K | 36.0 (#35) | N/A |
| GPT-oss-120b | 0.150 | 0.600 | 131K | 33K | 33.3 (#39) | 28.6 (#45) |
| Claude Sonnet 4.20250514 | 3.00 | 15.00 | 1.0M | 64K | 33.0 (#41) | 30.6 (#37) |
| Claude Opus 4.20250514 | 15.00 | 75.00 | 200K | 32K | 33.0 (#41) | N/A |
| V3.2 | 0.740 | 2.22 | 164K | 164K | 32.1 (#43) | 34.6 (#33) |
| Claude Haiku 4.5 | 1.00 | 5.00 | 200K | 64K | 31.1 (#46) | 29.6 (#43) |
| Eu.anthropic.claude 3 7 Sonnet 20250219 V1 | 3.00 | 15.00 | 200K | 8K | 30.8 (#48) | 26.7 (#47) |
| Zai.glm 4.7 Flash | 0.070 | 0.400 | 200K | 128K | 30.1 (#53) | 25.9 (#50) |
| Qwen3 Coder Next | 0.600 | 1.44 | 262K | 8K | 28.3 (#58) | 22.9 (#63) |
| Us.deepseek.r1 V1 | 1.35 | 5.40 | 128K | 4K | 27.1 (#60) | 24.0 (#57) |
| Qwen3 235B A22b 2507 V1 | 0.220 | 0.880 | 262K | 131K | 25.0 (#70) | 22.1 (#65) |
| Qwen3 Coder 480B A35b V1 | 0.220 | 1.80 | 262K | 66K | 24.8 (#71) | 24.6 (#56) |
| GPT-oss-20b | 0.075 | 0.300 | 131K | 33K | 24.5 (#72) | 18.5 (#80) |
| Moonshotai.kimi K2 Thinking | 0.730 | 3.03 | 262K | 262K | 24.1 (#73) | 15.5 (#96) |
| Moonshotai.kimi K2.5 | 0.720 | 3.60 | 262K | 262K | 23.1 (#77) | 12.6 (#116) |
| Nova 2 Pro Preview 20251202 V1 | 2.19 | 17.50 | 1.0M | 64K | 23.1 (#77) | 20.5 (#71) |
| Mistral Large 3 675B Instruct | 0.500 | 1.50 | 128K | 8K | 22.8 (#81) | 22.7 (#64) |
| Qwen3 Vl 235B A22b | 0.530 | 2.66 | 128K | 8K | 20.8 (#87) | 16.5 (#89) |
| Qwen3 Next 80B A3b | 0.150 | 1.20 | 128K | 8K | 20.1 (#90) | 15.3 (#97) |
| Qwen3 Coder 30B A3b V1 | 0.150 | 0.600 | 262K | 131K | 20.0 (#91) | 19.4 (#78) |
| Us.amazon.nova Premier V1 | 2.50 | 12.50 | 1.0M | 10K | 19.0 (#95) | 13.8 (#108) |
| Eu.anthropic.claude 3 5 Haiku 20241022 V1 | 0.250 | 1.25 | 200K | 8K | 18.7 (#98) | 10.7 (#131) |
| Magistral Small 2509 | 0.500 | 1.50 | 128K | 8K | 18.2 (#102) | 14.8 (#98) |
| Claude 3 Opus 20240229 V1 | 15.00 | 75.00 | 200K | 4K | 18.0 (#162) | 19.5 (#75) |
| Nova 2 Lite V1 | 0.300 | 2.50 | 1.0M | 64K | 18.0 (#104) | 12.5 (#117) |
| Llama3 1 405B Instruct V1 | 5.32 | 16.00 | 128K | 4K | 17.4 (#108) | 14.5 (#100) |
| V3 V1 | 0.580 | 1.68 | 164K | 82K | 16.5 (#116) | 16.4 (#90) |
| Ministral 3 14B Instruct | 0.200 | 0.200 | 128K | 8K | 16.0 (#120) | 10.9 (#127) |
| Claude 3 5 Sonnet 20240620 V1 | 3.00 | 15.00 | 1.0M | 4K | 15.9 (#123) | 30.2 (#40) |
| Ministral 3 8B Instruct | 0.150 | 0.150 | 128K | 8K | 14.8 (#134) | 10.0 (#136) |
| Qwen3 32B V1 | 0.150 | 0.600 | 131K | 16K | 14.5 (#137) | N/A |
| Llama3 3 70B Instruct V1 | 0.720 | 0.720 | 128K | 4K | 14.5 (#137) | 10.7 (#131) |
| Nova Pro V1 | 0.800 | 3.20 | 300K | 10K | 13.5 (#145) | 11.0 (#126) |
| Nvidia.nemotron Nano 9B V2 | 0.060 | 0.230 | 128K | 8K | 13.2 (#150) | 7.5 (#147) |
| Nvidia.nemotron Nano 3 30B | 0.060 | 0.240 | 262K | 8K | 13.2 (#150) | 15.8 (#93) |
| Mistral Large 2407 V1 | 3.00 | 9.00 | 128K | 8K | 13.0 (#153) | N/A |
| Nova Lite V1 | 0.060 | 0.240 | 300K | 10K | 12.7 (#158) | 5.1 (#156) |
| Llama3 1 70B Instruct V1 | 0.990 | 0.990 | 128K | 2K | 12.5 (#162) | 10.9 (#127) |
| Claude 3 Haiku 20240307 V1 | 0.250 | 1.25 | 200K | 4K | 12.3 (#167) | 6.7 (#152) |
| Llama3 2 90B Instruct V1 | 2.00 | 2.00 | 128K | 4K | 11.9 (#174) | N/A |
| Llama3 1 8B Instruct V1 | 0.220 | 0.220 | 128K | 2K | 11.8 (#175) | 4.9 (#157) |
| Ministral 3 3B Instruct | 0.100 | 0.100 | 128K | 8K | 11.2 (#176) | 4.8 (#158) |
| Jamba 1 5 Large V1 | 2.00 | 8.00 | 256K | 256K | 10.7 (#179) | N/A |
| Gemma 3 27B It | 0.230 | 0.380 | 128K | 8K | 10.3 (#185) | 9.6 (#139) |
| Claude 3 Sonnet 20240229 V1 | 3.00 | 15.00 | 200K | 4K | 10.3 (#185) | N/A |
| Nova Micro V1 | 0.035 | 0.140 | 128K | 10K | 10.3 (#185) | 4.1 (#160) |
| Nvidia.nemotron Nano 12B V2 | 0.200 | 0.600 | 128K | 8K | 10.1 (#189) | 5.9 (#155) |
| Llama3 2 3B Instruct V1 | 0.150 | 0.150 | 128K | 4K | 9.7 (#196) | N/A |
| Claude V2 | 8.00 | 24.00 | 100K | 8K | 9.3 (#199) | 14.0 (#105) |
| Mistral Small 2402 V1 | 1.00 | 3.00 | 32K | 8K | 9.0 (#201) | N/A |
| Llama3 70B Instruct V1 | 2.65 | 3.50 | 8K | 8K | 8.9 (#204) | 6.8 (#151) |
| Gemma 3 12B It | 0.090 | 0.290 | 128K | 8K | 8.8 (#205) | 6.3 (#154) |
| Llama3 2 11B Instruct V1 | 0.350 | 0.350 | 128K | 4K | 8.7 (#209) | 4.3 (#159) |
| Llama2 70B Chat V1 | 1.95 | 2.56 | 4K | 4K | 8.4 (#212) | N/A |
| Llama2 13B Chat V1 | 0.750 | 1.00 | 4K | 4K | 8.4 (#212) | N/A |
| Command R Plus V1 | 3.00 | 15.00 | 128K | 4K | 8.3 (#216) | N/A |
| Jamba 1 5 Mini V1 | 0.200 | 0.400 | 256K | 256K | 8.0 (#221) | N/A |
| Command R V1 | 0.500 | 1.50 | 128K | 4K | 7.4 (#227) | N/A |
| Claude Instant V1 | 0.800 | 2.40 | 100K | 8K | 7.4 (#227) | 7.8 (#142) |
| Llama3 8B Instruct V1 | 0.300 | 0.600 | 8K | 8K | 6.4 (#233) | 4.0 (#161) |
| Llama3 2 1B Instruct V1 | 0.100 | 0.100 | 128K | 4K | 6.3 (#234) | 0.6 (#173) |
| Gemma 3 4B It | 0.040 | 0.080 | 128K | 8K | 6.3 (#234) | 2.9 (#167) |
| Us.writer.palmyra X5 V1 | 0.600 | 6.00 | 1.0M | 8K | N/A | N/A |
| Us.writer.palmyra X4 V1 | 2.50 | 10.00 | 128K | 8K | N/A | N/A |
| GPT-oss-safeguard-20b | 0.070 | 0.200 | 128K | 8K | N/A | N/A |
| GPT-oss-safeguard-120b | 0.150 | 0.600 | 128K | 8K | N/A | N/A |
| Nvidia.nemotron Super 3 120B | 0.150 | 0.650 | 256K | 33K | N/A | N/A |
| Voxtral Small 24B 2507 | 0.100 | 0.300 | 128K | 8K | N/A | N/A |
| Voxtral Mini 3B 2507 | 0.040 | 0.040 | 128K | 8K | N/A | N/A |
| Devstral 2 123B | 0.400 | 2.00 | 256K | 8K | N/A | N/A |
| Llama4 Scout 17B Instruct V1 | 0.170 | 0.660 | 128K | 4K | N/A | N/A |
| Llama4 Maverick 17B Instruct V1 | 0.240 | 0.970 | 128K | 4K | N/A | N/A |
| Eu.twelvelabs.pegasus 1 2 V1 | N/A | 7.50 | N/A | N/A | N/A | N/A |
| Eu.mistral.pixtral Large 2502 V1 | 2.00 | 6.00 | 128K | 4K | N/A | N/A |
| Mixtral 8x7B Instruct V0 | 0.450 | 0.700 | 32K | 8K | N/A | N/A |
| Mistral Large 2402 V1 | 8.00 | 24.00 | 32K | 8K | N/A | N/A |
| Mistral 7B Instruct V0 | 0.150 | 0.200 | 32K | 8K | N/A | N/A |
| Qwen3 Coder Next | 0.600 | 1.44 | 262K | 8K | N/A | N/A |
| Minimax.minimax M2.1 | 0.360 | 1.44 | 196K | 8K | N/A | N/A |
| Command Text V14 | N/A | N/A | 4K | 4K | N/A | N/A |
| Command Light Text V14 | N/A | N/A | 4K | 4K | N/A | N/A |
| Claude V1 | 8.00 | 24.00 | 100K | 8K | N/A | N/A |
| Titan Text Premier V1 | 0.500 | 1.50 | 42K | 32K | N/A | N/A |
| Titan Text Lite V1 | 0.300 | 0.400 | 42K | 4K | N/A | N/A |
| Titan Text Express V1 | 1.30 | 1.70 | 42K | 8K | N/A | N/A |
| Jamba Instruct V1 | 0.500 | 0.700 | 70K | 4K | N/A | N/A |
| J2 Ultra V1 | 18.80 | 18.80 | 8K | 8K | N/A | N/A |
| J2 Mid V1 | 12.50 | 12.50 | 8K | 8K | N/A | N/A |
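Because input and output prices differ per model, a single "blended" price per million tokens is a common heuristic for head-to-head comparison. A minimal sketch, assuming a 3:1 input:output token ratio (the ratio is an illustrative assumption, not part of the source data) and using a few rows copied from the table:

```python
# Input/output prices ($ per 1M tokens) copied from the table above.
MODELS = {
    "Claude Haiku 4.5": (1.00, 5.00),
    "Zai.glm 4.7 Flash": (0.070, 0.400),
    "GPT-oss-120b": (0.150, 0.600),
}

def blended_price(input_price: float, output_price: float,
                  ratio: float = 3.0) -> float:
    """Weighted $/1M price, assuming `ratio` input tokens per output token."""
    return (ratio * input_price + output_price) / (ratio + 1.0)

# Rank the sample models from cheapest to most expensive blended price.
ranked = sorted(MODELS, key=lambda m: blended_price(*MODELS[m]))
print(ranked)  # → ['Zai.glm 4.7 Flash', 'GPT-oss-120b', 'Claude Haiku 4.5']
```

A blended price is only a rough guide: for chat-style workloads (short prompts, long completions) the output price dominates, so adjust the ratio to match your traffic before choosing a model.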