# AI Models Comparison
Compare AI model pricing and benchmarks across providers including OpenAI, Anthropic, Google, AWS Bedrock, Azure, Mistral, and more. Filter by model capabilities such as vision, function calling, and reasoning support to find the most cost-effective model for your use case. Currently tracking 1,872 models across 102 providers.

Pricing data is based on LiteLLM, maintained by the open-source community; benchmark scores come from Artificial Analysis. Last updated: March 25, 2026 at 12:00 AM UTC.
| Model | Input Price, $ | Output Price, $ | Context | Max Output | Intelligence (rank) | Coding (rank) |
|---|---|---|---|---|---|---|
| Qwen3 235B A22B Thinking 2507 | 0.300 | 2.90 | 262K | 262K | 39.9 (#29) | 30.5 (#38) |
| GPT-oss-120b | 0.050 | 0.450 | 131K | 131K | 33.3 (#39) | 28.6 (#45) |
| Claude 4 Sonnet | 3.30 | 16.50 | 200K | 200K | 33.0 (#41) | 30.6 (#37) |
| Claude 4 Opus | 16.50 | 82.50 | 200K | 200K | 33.0 (#41) | N/A |
| Kimi K2 Instruct 0905 | 0.500 | 2.00 | 262K | 262K | 30.9 (#47) | 25.9 (#50) |
| Claude 3.7 Sonnet | 3.30 | 16.50 | 200K | 200K | 30.8 (#48) | 26.7 (#47) |
| Gemini 2.5 Pro | 1.25 | 10.00 | 1.0M | 1.0M | 30.3 (#51) | 46.7 (#7) |
| DeepSeek V3.1 Terminus | 0.270 | 1.00 | 164K | 164K | 28.5 (#57) | 31.9 (#36) |
| DeepSeek V3.1 | 0.270 | 1.00 | 164K | 164K | 28.1 (#59) | 28.4 (#46) |
| DeepSeek R1 | 0.700 | 2.40 | 164K | 164K | 27.1 (#60) | 24.0 (#57) |
| Qwen3 Next 80B A3B Thinking | 0.140 | 1.40 | 262K | 262K | 26.7 (#63) | 19.5 (#75) |
| GLM 4.5 | 0.400 | 1.60 | 131K | 131K | 26.4 (#64) | 26.3 (#49) |
| Kimi K2 Instruct | 0.500 | 2.00 | 131K | 131K | 26.3 (#65) | 22.1 (#65) |
| Qwen3 235B A22B Instruct 2507 | 0.090 | 0.600 | 262K | 262K | 25.0 (#70) | 22.1 (#65) |
| Qwen3 Coder 480B A35B Instruct | 0.400 | 1.60 | 262K | 262K | 24.8 (#71) | 24.6 (#56) |
| GPT-oss-20b | 0.040 | 0.150 | 131K | 131K | 24.5 (#72) | 18.5 (#80) |
| DeepSeek V3 0324 | 0.250 | 0.880 | 164K | 164K | 22.3 (#82) | 22.0 (#67) |
| Gemini 2.5 Flash | 0.300 | 2.50 | 1.0M | 1.0M | 20.6 (#87) | 17.8 (#86) |
| Qwen3 Next 80B A3B Instruct | 0.140 | 1.40 | 262K | 262K | 20.1 (#89) | 15.3 (#97) |
| QwQ 32B | 0.150 | 0.400 | 131K | 131K | 19.7 (#92) | N/A |
| Gemini 2.0 Flash-001 | 0.100 | 0.400 | 1.0M | 1.0M | 18.5 (#99) | 13.6 (#110) |
| Hermes 3 Llama 3.1 405B | 1.00 | 1.00 | 131K | 131K | 17.6 (#106) | 18.1 (#84) |
| DeepSeek R1 Distill Qwen 32B | 0.270 | 0.270 | 131K | 131K | 17.2 (#109) | N/A |
| Qwen3 235B A22B | 0.180 | 0.540 | 41K | 41K | 17.0 (#112) | 14.0 (#105) |
| DeepSeek V3 | 0.380 | 0.890 | 164K | 164K | 16.5 (#115) | 16.4 (#90) |
| DeepSeek R1 Distill Llama 70B | 0.200 | 0.600 | 131K | 131K | 16.0 (#119) | 11.4 (#120) |
| Qwen2.5 72B Instruct | 0.120 | 0.390 | 33K | 33K | 15.6 (#124) | 11.9 (#119) |
| Mistral Small 3.2 24B Instruct 2506 | 0.075 | 0.200 | 128K | 128K | 15.1 (#128) | 13.3 (#112) |
| Llama 3.3 Nemotron Super 49B V1.5 | 0.100 | 0.400 | 131K | 131K | 14.6 (#135) | 10.5 (#133) |
| Qwen3 32B | 0.100 | 0.280 | 41K | 41K | 14.5 (#136) | N/A |
| Llama 3.3 70B Instruct | 0.230 | 0.400 | 131K | 131K | 14.5 (#136) | 10.7 (#130) |
| Llama 3.1 Nemotron 70B Instruct | 0.600 | 0.600 | 131K | 131K | 13.4 (#147) | 10.8 (#128) |
| NVIDIA Nemotron Nano 9B V2 | 0.040 | 0.160 | 131K | 131K | 13.2 (#149) | 7.5 (#146) |
| Qwen3 30B A3B | 0.080 | 0.290 | 41K | 41K | 12.5 (#161) | 13.3 (#112) |
| Meta Llama 3.1 70B Instruct | 0.400 | 0.400 | 131K | 131K | 12.5 (#161) | 10.9 (#126) |
| Meta Llama 3.1 8B Instruct | 0.030 | 0.050 | 131K | 131K | 11.8 (#174) | 4.9 (#157) |
| Hermes 3 Llama 3.1 70B | 0.300 | 0.300 | 131K | 131K | 10.6 (#179) | N/A |
| Phi 4 | 0.070 | 0.140 | 16K | 16K | 10.4 (#183) | 11.2 (#121) |
| Gemma 3 27B It | 0.090 | 0.160 | 131K | 131K | 10.3 (#184) | 9.6 (#138) |
| Llama 3.2 3B Instruct | 0.020 | 0.020 | 131K | 131K | 9.7 (#195) | N/A |
| Gemma 3 12B It | 0.050 | 0.100 | 131K | 131K | 8.8 (#204) | 6.3 (#154) |
| Llama 3.2 11B Vision Instruct | 0.049 | 0.049 | 131K | 131K | 8.7 (#208) | 4.3 (#159) |
| Qwen3 14B | 0.060 | 0.240 | 41K | 41K | 7.4 (#226) | N/A |
| Meta Llama 3 8B Instruct | 0.030 | 0.060 | 8K | 8K | 6.4 (#232) | 4.0 (#161) |
| Gemma 3 4B It | 0.040 | 0.080 | 131K | 131K | 6.3 (#233) | 2.9 (#167) |
| L3.3 70B Euryale V2.3 | 0.650 | 0.750 | 131K | 131K | N/A | N/A |
| L3.1 70B Euryale V2.2 | 0.650 | 0.750 | 131K | 131K | N/A | N/A |
| L3 8B Lunaris V1 Turbo | 0.040 | 0.050 | 8K | 8K | N/A | N/A |
| Qwen3 Coder 480B A35B Instruct Turbo | 0.290 | 1.20 | 262K | 262K | N/A | N/A |
| Qwen2.5 VL 32B Instruct | 0.200 | 0.600 | 128K | 128K | N/A | N/A |
| Qwen2.5 7B Instruct | 0.040 | 0.100 | 33K | 33K | N/A | N/A |
| Mixtral 8x7B Instruct V0.1 | 0.400 | 0.400 | 33K | 33K | N/A | N/A |
| Mistral Small 24B Instruct 2501 | 0.050 | 0.080 | 33K | 33K | N/A | N/A |
| Mistral Nemo Instruct 2407 | 0.020 | 0.040 | 131K | 131K | N/A | N/A |
| WizardLM 2 8x22B | 0.480 | 0.480 | 66K | 66K | N/A | N/A |
| Llama Guard 4 12B | 0.180 | 0.180 | 164K | 164K | N/A | N/A |
| Llama Guard 3 8B | 0.055 | 0.055 | 131K | 131K | N/A | N/A |
| Llama 4 Scout 17B 16E Instruct | 0.080 | 0.300 | 328K | 328K | N/A | N/A |
| Llama 4 Maverick 17B 128E Instruct FP8 | 0.150 | 0.600 | 1.0M | 1.0M | N/A | N/A |
| MythoMax L2 13B | 0.080 | 0.090 | 4K | 4K | N/A | N/A |
| DeepSeek R1 Turbo | 1.00 | 3.00 | 41K | 41K | N/A | N/A |
| DeepSeek R1 0528 Turbo | 1.00 | 3.00 | 33K | 33K | N/A | N/A |
| DeepSeek R1 0528 | 0.500 | 2.15 | 164K | 164K | N/A | N/A |
| OlmOCR 7B 0725 FP8 | 0.270 | 1.50 | 16K | 16K | N/A | N/A |
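To compare models on cost for a specific workload rather than raw per-token price, you can weight input and output prices by your expected token mix. The sketch below is a minimal example of this calculation, assuming the table's prices are USD per 1M tokens (the page does not state the unit explicitly); the model list is just a hypothetical sample of rows copied from the table above.

```python
# Hypothetical sample rows from the table: (model, input $/1M, output $/1M).
# Assumption: prices are USD per 1M tokens.
MODELS = [
    ("GPT-oss-120b", 0.050, 0.450),
    ("Claude 4 Sonnet", 3.30, 16.50),
    ("Gemini 2.5 Pro", 1.25, 10.00),
    ("DeepSeek V3.1", 0.270, 1.00),
]

def workload_cost(input_price, output_price, input_tokens, output_tokens):
    """Total USD cost for a workload, with prices quoted per 1M tokens."""
    return (input_tokens * input_price + output_tokens * output_price) / 1_000_000

# Example workload: 2M input tokens and 0.5M output tokens per month.
ranked = sorted(
    MODELS,
    key=lambda m: workload_cost(m[1], m[2], 2_000_000, 500_000),
)
for name, inp, outp in ranked:
    cost = workload_cost(inp, outp, 2_000_000, 500_000)
    print(f"{name}: ${cost:.2f}")
```

Because the input:output ratio varies by use case (retrieval-heavy apps are input-dominated, generation-heavy apps output-dominated), the cheapest model by blended cost can differ from the cheapest by either column alone.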