# AI Models Comparison
Compare AI model pricing and benchmarks across providers including OpenAI, Anthropic, Google, AWS Bedrock, Azure, Mistral, and more. Filter by model capabilities like vision, function calling, and reasoning support. Find the most cost-effective model for your use case. Currently tracking 1,909 models across 103 providers.
Pricing data is sourced from LiteLLM, maintained by the open-source community; benchmark scores come from Artificial Analysis. Last updated: April 15, 2026 at 12:00 AM UTC.
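Prices in the table below are USD per million tokens, billed separately for input and output. A minimal sketch of how to turn those two numbers into a per-request cost estimate (the token counts are arbitrary example values; the prices are GPT-5's from the table):

```python
def request_cost(input_tokens: int, output_tokens: int,
                 input_price: float, output_price: float) -> float:
    """Estimate USD cost of one request, given $/1M-token prices."""
    return (input_tokens * input_price + output_tokens * output_price) / 1_000_000

# Example: GPT-5 at $1.25 input / $10.00 output per 1M tokens,
# for a request with 10K input tokens and 2K output tokens.
cost = request_cost(10_000, 2_000, 1.25, 10.00)
print(f"${cost:.4f}")  # → $0.0325
```

Note that output tokens usually dominate the bill for generation-heavy workloads, since output prices run 3–8× input prices for most models listed here.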
| Model | Input Price ($/1M tokens) | Output Price ($/1M tokens) | Context | Max Output | Intelligence (rank) | Coding (rank) |
|---|---|---|---|---|---|---|
| Gemini 3.1 Pro Preview | 2.00 | 12.00 | 1.0M | 66K | 57.2 (#1) | 55.5 (#2) |
| GPT-5.2 | 1.75 | 14.00 | 272K | 128K | 51.3 (#4) | 48.7 (#5) |
| GLM 5 | 0.800 | 2.56 | 203K | 128K | 49.8 (#5) | 44.2 (#11) |
| GPT-5.2-codex | 1.75 | 14.00 | 272K | 128K | 49.0 (#7) | 43.0 (#13) |
| Gemini 3 Pro Preview | 2.00 | 12.00 | 1.0M | 66K | 48.4 (#9) | 46.5 (#8) |
| Kimi K2.5 | 0.600 | 3.00 | 262K | 262K | 46.8 (#11) | 39.5 (#18) |
| Claude Opus 4.6 | 5.00 | 25.00 | 1.0M | 128K | 46.5 (#12) | 47.6 (#6) |
| Qwen3.5 397B A17B | 0.600 | 3.60 | 262K | 66K | 45.0 (#13) | 41.3 (#15) |
| GPT-5-codex | 1.25 | 10.00 | 272K | 128K | 44.6 (#14) | 38.9 (#20) |
| GPT-5 | 1.25 | 10.00 | 272K | 128K | 44.6 (#14) | 36.0 (#29) |
| Claude Sonnet 4.6 | 3.00 | 15.00 | 1.0M | 128K | 44.4 (#16) | 46.4 (#9) |
| Claude Opus 4.5 | 5.00 | 25.00 | 200K | 32K | 43.1 (#18) | 42.9 (#14) |
| GLM 4.7 | 0.400 | 1.50 | 203K | 64K | 42.1 (#20) | 36.3 (#28) |
| Qwen3.5 27B | 0.300 | 2.40 | 262K | 66K | 42.1 (#20) | 34.9 (#31) |
| MiniMax M2.5 | 0.300 | 1.10 | 197K | 66K | 41.9 (#22) | 37.4 (#24) |
| Qwen3.5 122B A10B | 0.400 | 2.00 | 262K | 66K | 41.6 (#24) | 34.7 (#32) |
| Grok 4 | 3.00 | 15.00 | 256K | 256K | 41.5 (#25) | 40.5 (#16) |
| GPT-5 mini | 0.250 | 2.00 | 272K | 128K | 41.2 (#26) | 35.3 (#30) |
| Qwen3 235B A22B Thinking 2507 | 0.110 | 0.600 | 262K | 262K | 39.9 (#29) | 30.5 (#38) |
| Qwen3.5 35B A3B | 0.250 | 2.00 | 262K | 66K | 37.1 (#32) | 30.3 (#39) |
| Claude Sonnet 4.5 | 3.00 | 15.00 | 1.0M | 1.0M | 37.1 (#32) | 33.5 (#35) |
| MiniMax M2 | 0.255 | 1.02 | 205K | 205K | 36.1 (#34) | 29.2 (#44) |
| Claude Opus 4.1 | 15.00 | 75.00 | 200K | 32K | 36.0 (#35) | N/A |
| Gemini 3 Flash Preview | 0.500 | 3.00 | 1.0M | 66K | 35.0 (#37) | 37.8 (#23) |
| GPT-oss-120b | 0.180 | 0.800 | 131K | 33K | 33.3 (#39) | 28.6 (#45) |
| Claude Sonnet 4 | 3.00 | 15.00 | 1.0M | 64K | 33.0 (#41) | 30.6 (#37) |
| Claude Opus 4 | 15.00 | 75.00 | 200K | 32K | 33.0 (#41) | N/A |
| DeepSeek V3.2 | 0.280 | 0.400 | 164K | 164K | 32.1 (#43) | 34.6 (#33) |
| Claude Haiku 4.5 | 1.00 | 5.00 | 200K | 200K | 31.1 (#46) | 29.6 (#43) |
| o1 | 15.00 | 60.00 | 200K | 100K | 30.8 (#48) | 20.5 (#71) |
| Claude 3.7 Sonnet | 3.00 | 15.00 | 200K | 128K | 30.8 (#48) | 26.7 (#47) |
| MiMo V2 Flash | 0.090 | 0.290 | 262K | 16K | 30.4 (#50) | 25.8 (#52) |
| Gemini 2.5 Pro | 1.25 | 10.00 | 1.0M | 8K | 30.3 (#51) | 46.7 (#7) |
| GLM 4.6 | 0.400 | 1.75 | 203K | 131K | 30.2 (#52) | 30.2 (#40) |
| GLM 4.7 Flash | 0.070 | 0.400 | 200K | 32K | 30.1 (#53) | 25.9 (#50) |
| DeepSeek R1 | 0.550 | 2.19 | 65K | 8K | 27.1 (#60) | 24.0 (#57) |
| GPT-5 nano | 0.050 | 0.400 | 272K | 128K | 26.8 (#62) | 20.3 (#73) |
| GPT-4.1 | 2.00 | 8.00 | 1.0M | 33K | 26.3 (#65) | 21.8 (#68) |
| o3 mini | 1.10 | 4.40 | 128K | 66K | 25.9 (#67) | 17.9 (#85) |
| Qwen3 235B A22B 2507 | 0.071 | 0.100 | 262K | 262K | 25.0 (#70) | 22.1 (#65) |
| GPT-oss-20b | 0.020 | 0.100 | 131K | 33K | 24.5 (#72) | 18.5 (#80) |
| GPT-4.1 mini | 0.400 | 1.60 | 1.0M | 33K | 22.9 (#80) | 18.5 (#80) |
| Gemini 2.5 Flash | 0.300 | 2.50 | 1.0M | 8K | 20.6 (#88) | 17.8 (#86) |
| Gemini 2.0 Flash-001 | 0.100 | 0.400 | 1.0M | 8K | 18.5 (#100) | 13.6 (#110) |
| GPT-4o | 2.50 | 10.00 | 128K | 4K | 17.3 (#109) | 16.7 (#88) |
| Ministral 14B 2512 | 0.200 | 0.200 | 262K | 262K | 16.0 (#120) | 10.9 (#127) |
| Claude 3.5 Sonnet | 3.00 | 15.00 | 200K | 8K | 15.9 (#123) | 30.2 (#40) |
| Mistral Small 3.2 24B Instruct | 0.100 | 0.300 | 32K | N/A | 15.1 (#129) | 13.3 (#112) |
| Ministral 8B 2512 | 0.150 | 0.150 | 262K | 262K | 14.8 (#134) | 10.0 (#136) |
| Mistral Small 3.1 24B Instruct | 0.100 | 0.300 | 32K | N/A | 14.5 (#137) | 13.9 (#107) |
| GPT-4.1 nano | 0.100 | 0.400 | 1.0M | 33K | 13.0 (#153) | 11.2 (#122) |
| Qwen 2.5 Coder 32B Instruct | 0.180 | 0.180 | 34K | 34K | 12.9 (#155) | N/A |
| GPT-4 | 30.00 | 60.00 | 8K | N/A | 12.8 (#157) | 13.1 (#114) |
| Claude 3 Haiku | 0.250 | 1.25 | 200K | N/A | 12.3 (#167) | 6.7 (#152) |
| Ministral 3B 2512 | 0.100 | 0.100 | 131K | 131K | 11.2 (#176) | 4.8 (#158) |
| Mistral Large | 8.00 | 24.00 | 32K | N/A | 9.9 (#194) | N/A |
| Mixtral 8x22B Instruct | 0.650 | 0.650 | 66K | N/A | 9.8 (#195) | N/A |
| GPT-3.5 Turbo | 1.50 | 2.00 | 4K | N/A | 9.0 (#201) | 10.7 (#131) |
| Llama 3 70B Instruct | 0.590 | 0.790 | 8K | N/A | 8.9 (#204) | 6.8 (#151) |
| Mistral 7B Instruct | 0.130 | 0.130 | 8K | N/A | 7.4 (#227) | N/A |
| ReMM SLERP L2 13B | 1.88 | 1.88 | 6K | N/A | N/A | N/A |
| Router | 0.850 | 3.40 | 131K | 131K | N/A | N/A |
| Qwen3.5 Plus 02 15 | 0.400 | 2.40 | 1.0M | 66K | N/A | N/A |
| Qwen3.5 Flash 02 23 | 0.100 | 0.400 | 1.0M | 66K | N/A | N/A |
| Qwen3 Coder Plus | 1.00 | 5.00 | 998K | 66K | N/A | N/A |
| Qwen3 Coder | 0.220 | 0.950 | 262K | 262K | N/A | N/A |
| Qwen VL Plus | 0.210 | 0.630 | 8K | 2K | N/A | N/A |
| Free | N/A | N/A | 200K | N/A | N/A | N/A |
| Bodybuilder | N/A | N/A | 128K | N/A | N/A | N/A |
| Auto | N/A | N/A | 2.0M | N/A | N/A | N/A |
| GPT-5.2-pro | 21.00 | 168.00 | 272K | 128K | N/A | N/A |
| GPT-5.1-codex-max | 1.25 | 10.00 | 400K | 128K | N/A | N/A |
| Mistral Large 2512 | 0.500 | 1.50 | 262K | 262K | N/A | N/A |
| Devstral 2512 | 0.150 | 0.600 | 262K | 66K | N/A | N/A |
| MiniMax M2.1 | 0.270 | 1.20 | 204K | 64K | N/A | N/A |
| Weaver | 5.63 | 5.63 | 8K | N/A | N/A | N/A |
| MythoMax L2 13B | 1.88 | 1.88 | 8K | N/A | N/A | N/A |
| DeepSeek R1 0528 | 0.500 | 2.15 | 65K | 8K | N/A | N/A |
| DeepSeek Chat V3.1 | 0.200 | 0.800 | 164K | 164K | N/A | N/A |
| DeepSeek Chat V3 0324 | 0.140 | 0.280 | 66K | 8K | N/A | N/A |
| DeepSeek Chat | 0.140 | 0.280 | 66K | 8K | N/A | N/A |
| UI-TARS 1.5 7B | 0.100 | 0.200 | 131K | 2K | N/A | N/A |
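Because input and output are priced separately, "cheapest model" depends on your traffic shape. One common way to compare is a blended price at an assumed input:output token ratio; the 3:1 ratio below is an arbitrary assumption, and the prices are copied from the table above for a few illustrative models:

```python
# ($/1M input tokens, $/1M output tokens), taken from the table above
PRICES = {
    "GPT-5 mini":       (0.250, 2.00),
    "Gemini 2.5 Flash": (0.300, 2.50),
    "DeepSeek V3.2":    (0.280, 0.400),
    "Claude Haiku 4.5": (1.00, 5.00),
}

def blended_price(input_price: float, output_price: float,
                  ratio: float = 3.0) -> float:
    """Blended $/1M tokens, assuming `ratio` input tokens per output token."""
    return (ratio * input_price + output_price) / (ratio + 1)

# Rank the models by blended price for a 3:1 workload.
ranked = sorted(PRICES, key=lambda m: blended_price(*PRICES[m]))
cheapest = ranked[0]
print(cheapest)  # → DeepSeek V3.2
```

With a generation-heavy workload (say a 1:3 ratio), the ranking can flip toward models with low output prices, which is why the ratio assumption matters more than either price in isolation.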