AI Models Comparison
Compare AI model pricing and benchmarks across providers including OpenAI, Anthropic, Google, AWS Bedrock, Azure, Mistral, and more. Filter by model capabilities like vision, function calling, and reasoning support. Find the most cost-effective model for your use case. Currently tracking 1,872 models across 102 providers.
Pricing data comes from LiteLLM, maintained by the open-source community; benchmark data comes from Artificial Analysis. Last updated March 25, 2026, at 12:00 AM UTC.
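When picking a cost-effective model from a table like this, it helps to weight input and output prices by your expected token mix rather than comparing either price alone. A minimal sketch in Python, using a few prices from the table below; the 3:1 input-to-output token ratio and the `blended_cost` helper are illustrative assumptions, not part of the source data:

```python
# Blended cost per 1M tokens for a given input/output token mix.
# Prices are $/1M tokens, copied from the table below.
# The 75% input share is an illustrative assumption: tune it to your workload.
PRICES = {
    "GLM 4.7":       (0.600, 2.20),
    "MiniMax M2":    (0.300, 1.20),
    "GPT-oss-120b":  (0.050, 0.250),
    "DeepSeek V3.2": (0.269, 0.400),
}

def blended_cost(input_price, output_price, input_share=0.75):
    """Weighted $/1M tokens, assuming `input_share` of tokens are input."""
    return input_price * input_share + output_price * (1 - input_share)

# Rank the sample models from cheapest to most expensive blended cost.
ranked = sorted(PRICES, key=lambda m: blended_cost(*PRICES[m]))
for model in ranked:
    print(f"{model}: ${blended_cost(*PRICES[model]):.3f} per 1M tokens")
```

Cheap input pricing matters most for retrieval-heavy workloads; for long-form generation, shrink `input_share` and the ranking can change.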
| Model | Input Price | Output Price | Context | Max Output | Intelligence | Coding |
|---|---|---|---|---|---|---|
| GLM 4.7 | 0.600 | 2.20 | 205K | 131K | 42.1 (#20) | 36.3 (#28) |
| MiniMax M2 | 0.300 | 1.20 | 205K | 131K | 36.1 (#34) | 29.2 (#44) |
| KAT Coder Pro | 0.300 | 1.20 | 256K | 128K | 36.0 (#35) | 18.3 (#82) |
| GPT-oss-120b | 0.050 | 0.250 | 131K | 33K | 33.3 (#39) | 28.6 (#45) |
| DeepSeek V3.2 | 0.269 | 0.400 | 164K | 66K | 32.1 (#43) | 34.6 (#33) |
| Qwen3 Max | 2.11 | 8.45 | 262K | 66K | 31.4 (#45) | 26.4 (#48) |
| Kimi K2 0905 | 0.600 | 2.50 | 262K | 262K | 30.9 (#47) | 25.9 (#50) |
| MiMo V2 Flash | 0.100 | 0.300 | 262K | 32K | 30.4 (#50) | 25.8 (#52) |
| GLM 4.6 | 0.550 | 2.20 | 205K | 131K | 30.2 (#52) | 30.2 (#40) |
| Qwen3 235B A22B Thinking 2507 | 0.300 | 3.00 | 131K | 33K | 29.5 (#54) | 23.2 (#62) |
| DeepSeek V3.1 Terminus | 0.270 | 1.00 | 131K | 33K | 28.5 (#57) | 31.9 (#36) |
| DeepSeek V3.1 | 0.270 | 1.00 | 131K | 33K | 28.1 (#59) | 28.4 (#46) |
| Qwen3 Next 80B A3B Thinking | 0.150 | 1.50 | 131K | 33K | 26.7 (#63) | 19.5 (#75) |
| GLM 4.5 | 0.600 | 2.20 | 131K | 98K | 26.4 (#64) | 26.3 (#49) |
| Kimi K2 Instruct | 0.570 | 2.30 | 131K | 131K | 26.3 (#65) | 22.1 (#65) |
| Qwen3 235B A22B Instruct 2507 | 0.090 | 0.580 | 131K | 16K | 25.0 (#70) | 22.1 (#65) |
| Qwen3 Coder 480B A35B Instruct | 0.300 | 1.30 | 262K | 66K | 24.8 (#71) | 24.6 (#56) |
| GPT-oss-20b | 0.040 | 0.150 | 131K | 33K | 24.5 (#72) | 18.5 (#80) |
| Kimi K2 Thinking | 0.600 | 2.50 | 262K | 262K | 24.1 (#73) | 15.5 (#96) |
| GLM 4.5 Air | 0.130 | 0.850 | 131K | 98K | 23.2 (#76) | 23.8 (#58) |
| DeepSeek V3 0324 | 0.270 | 1.12 | 164K | 164K | 22.3 (#82) | 22.0 (#67) |
| MiniMax M1 80K | 0.550 | 2.20 | 1.0M | 40K | 20.9 (#85) | 14.1 (#104) |
| Qwen3 VL 235B A22B Instruct | 0.300 | 1.50 | 131K | 33K | 20.8 (#86) | 16.5 (#89) |
| Qwen3 Next 80B A3B Instruct | 0.150 | 1.50 | 131K | 33K | 20.1 (#89) | 15.3 (#97) |
| Qwen3 Coder 30B A3B Instruct | 0.070 | 0.270 | 160K | 33K | 20.0 (#90) | 19.4 (#78) |
| DeepSeek R1 Distill Qwen 32B | 0.300 | 0.300 | 64K | 32K | 17.2 (#109) | N/A |
| GLM 4.6V | 0.300 | 0.900 | 131K | 33K | 17.1 (#111) | 11.1 (#123) |
| Qwen3 235B A22B FP8 | 0.200 | 0.800 | 41K | 20K | 17.0 (#112) | 14.0 (#105) |
| Qwen3 VL 8B Instruct | 0.080 | 0.500 | 131K | 33K | 16.7 (#114) | 9.8 (#137) |
| DeepSeek V3 Turbo | 0.400 | 1.30 | 64K | 16K | 16.5 (#115) | 16.4 (#90) |
| Qwen3 VL 30B A3B Instruct | 0.200 | 0.700 | 131K | 33K | 16.1 (#118) | 14.3 (#102) |
| DeepSeek R1 Distill Llama 70B | 0.800 | 0.800 | 8K | 8K | 16.0 (#119) | 11.4 (#120) |
| DeepSeek R1 Distill Qwen 14B | 0.150 | 0.150 | 33K | 16K | 15.8 (#123) | N/A |
| Qwen 2.5 72B Instruct | 0.380 | 0.400 | 32K | 8K | 15.6 (#124) | 11.9 (#119) |
| ERNIE 4.5 300B A47B Paddle | 0.280 | 1.10 | 123K | 12K | 15.0 (#130) | 14.5 (#100) |
| Qwen3 32B FP8 | 0.100 | 0.450 | 41K | 20K | 14.5 (#136) | N/A |
| Llama 3.3 70B Instruct | 0.135 | 0.400 | 131K | 120K | 14.5 (#136) | 10.7 (#130) |
| GLM 4.5V | 0.600 | 1.80 | 66K | 16K | 12.7 (#157) | 10.8 (#128) |
| Qwen3 4B FP8 | 0.030 | 0.030 | 128K | 20K | 12.5 (#161) | N/A |
| Qwen3 30B A3B FP8 | 0.090 | 0.450 | 41K | 20K | 12.5 (#161) | 13.3 (#112) |
| Llama 3.1 8B Instruct | 0.020 | 0.050 | 16K | 16K | 11.8 (#174) | 4.9 (#157) |
| Qwen3 8B FP8 | 0.035 | 0.138 | 128K | 20K | 10.6 (#179) | 7.1 (#150) |
| Gemma 3 27B IT | 0.119 | 0.200 | 98K | 16K | 10.3 (#184) | 9.6 (#138) |
| Llama 3.2 3B Instruct | 0.030 | 0.050 | 33K | 32K | 9.7 (#195) | N/A |
| Llama 3 70B Instruct | 0.510 | 0.740 | 8K | 8K | 8.9 (#203) | 6.8 (#151) |
| Gemma 3 12B IT | 0.050 | 0.100 | 131K | 8K | 8.8 (#204) | 6.3 (#154) |
| Llama 3 8B Instruct | 0.040 | 0.040 | 8K | 8K | 6.4 (#232) | 4.0 (#161) |
| AutoGLM Phone 9B Multilingual | 0.035 | 0.138 | 66K | 66K | N/A | N/A |
| R1V4 Lite | 0.200 | 0.600 | 262K | 66K | N/A | N/A |
| L31 70B Euryale V2.2 | 1.48 | 1.48 | 8K | 8K | N/A | N/A |
| L3 8B Stheno V3.2 | 0.050 | 0.050 | 8K | 32K | N/A | N/A |
| L3 8B Lunaris | 0.050 | 0.050 | 8K | 8K | N/A | N/A |
| L3 70B Euryale V2.1 | 1.48 | 1.48 | 8K | 8K | N/A | N/A |
| Qwen2.5 VL 72B Instruct | 0.800 | 0.800 | 33K | 33K | N/A | N/A |
| Qwen2.5 7B Instruct | 0.070 | 0.070 | 32K | 32K | N/A | N/A |
| Qwen MT Plus | 0.250 | 0.750 | 16K | 8K | N/A | N/A |
| PaddleOCR VL | 0.020 | 0.020 | 16K | 16K | N/A | N/A |
| Hermes 2 Pro Llama 3 8B | 0.140 | 0.140 | 8K | 8K | N/A | N/A |
| Mistral Nemo | 0.040 | 0.170 | 60K | 16K | N/A | N/A |
| MiniMax M2.1 | 0.300 | 1.20 | 205K | 131K | N/A | N/A |
| WizardLM 2 8x22B | 0.620 | 0.620 | 66K | 8K | N/A | N/A |
| Llama 4 Scout 17B 16E Instruct | 0.180 | 0.590 | 131K | 131K | N/A | N/A |
| Llama 4 Maverick 17B 128E Instruct FP8 | 0.270 | 0.850 | 1.0M | 8K | N/A | N/A |
| MythoMax L2 13B | 0.090 | 0.090 | 4K | 3K | N/A | N/A |
| DeepSeek R1 Turbo | 0.700 | 2.50 | 64K | 16K | N/A | N/A |
| DeepSeek R1 0528 | 0.700 | 2.50 | 164K | 33K | N/A | N/A |
| DeepSeek Prover V2 671B | 0.700 | 2.50 | 160K | 160K | N/A | N/A |
| DeepSeek OCR | 0.030 | 0.030 | 8K | 8K | N/A | N/A |
| ERNIE 4.5 VL 424B A47B | 0.420 | 1.25 | 123K | 16K | N/A | N/A |
| ERNIE 4.5 VL 28B A3B | 0.140 | 0.560 | 30K | 8K | N/A | N/A |
| ERNIE 4.5 21B A3B | 0.070 | 0.280 | 120K | 8K | N/A | N/A |
| Baichuan M2 32B | 0.070 | 0.070 | 131K | 131K | N/A | N/A |

Prices are in USD per 1M tokens. Intelligence and Coding show the Artificial Analysis score followed by the model's rank across all tracked models (e.g. 42.1 (#20)); N/A means no benchmark data is available for that model.