# AI Models Comparison
Compare AI model pricing and benchmarks across providers including OpenAI, Anthropic, Google, AWS Bedrock, Azure, Mistral, and more. Filter by model capabilities such as vision, function calling, and reasoning support to find the most cost-effective model for your use case. Currently tracking 1,909 models across 103 providers.
Pricing data comes from LiteLLM, maintained by the open-source community; benchmark scores come from Artificial Analysis. Last updated on April 15, 2026 at 12:00 AM UTC.
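Assuming the listed prices follow the usual per-million-token convention (the table itself only labels them "$"), a minimal sketch of estimating per-request cost from the two rates:

```python
def request_cost(input_tokens: int, output_tokens: int,
                 input_price: float, output_price: float) -> float:
    """Cost in USD for one request, assuming prices are quoted per 1M tokens."""
    return (input_tokens * input_price + output_tokens * output_price) / 1_000_000

# Example with Claude Opus 4.6's listed rates ($5.00 input, $25.00 output):
# a request with 10,000 input tokens and 2,000 output tokens
cost = request_cost(10_000, 2_000, 5.00, 25.00)
print(f"${cost:.2f}")  # $0.10
```

Because output tokens are typically 4-5x the input price, workloads that generate long responses should weight the output rate more heavily when comparing models.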
Intelligence and Coding cells show the benchmark score, with the model's overall rank in parentheses.

| Model | Input Price, $ | Output Price, $ | Context | Max Output | Intelligence | Coding |
|---|---|---|---|---|---|---|
| Claude Opus 4.6 | 5.00 | 25.00 | 200K | 64K | 46.5 (#12) | 47.6 (#6) |
| Claude Opus 4.5 | 5.00 | 25.00 | 200K | 64K | 43.1 (#18) | 42.9 (#14) |
| Grok 4 | 3.00 | 15.00 | 256K | 256K | 41.5 (#25) | 40.5 (#16) |
| o3 | 2.00 | 8.00 | 200K | 100K | 38.4 (#31) | 38.4 (#21) |
| Claude Sonnet 4.5 | 3.00 | 15.00 | 1.0M | 64K | 37.1 (#32) | 33.5 (#35) |
| Claude Opus 4.1 | 15.00 | 75.00 | 200K | 32K | 36.0 (#35) | N/A |
| o4-mini | 1.10 | 4.40 | 200K | 100K | 33.1 (#40) | 25.6 (#53) |
| Claude 4 Sonnet | 3.00 | 15.00 | 200K | 64K | 33.0 (#41) | 30.6 (#37) |
| Claude 4 Opus | 15.00 | 75.00 | 200K | 32K | 33.0 (#41) | N/A |
| Claude Haiku 4.5 | 1.00 | 5.00 | 200K | 64K | 31.1 (#46) | 29.6 (#43) |
| o1 | 15.00 | 60.00 | 200K | 100K | 30.8 (#48) | 20.5 (#71) |
| Claude 3.7 Sonnet | 3.00 | 15.00 | 200K | 64K | 30.8 (#48) | 26.7 (#47) |
| Gemini 2.5 Pro | 2.50 | 10.00 | 1.0M | 66K | 30.3 (#51) | 46.7 (#7) |
| GLM-4.6 | 0.45 | 1.80 | 200K | 200K | 30.2 (#52) | 30.2 (#40) |
| DeepSeek R1 | 0.55 | 2.19 | 128K | 8K | 27.1 (#60) | 24.0 (#57) |
| GLM-4.5 | 0.60 | 2.20 | 131K | 131K | 26.4 (#64) | 26.3 (#49) |
| GPT-4.1 | 2.00 | 8.00 | 1.0M | 33K | 26.3 (#65) | 21.8 (#68) |
| Kimi K2 | 0.55 | 2.20 | 131K | 16K | 26.3 (#65) | 22.1 (#65) |
| o3-mini | 1.10 | 4.40 | 200K | 100K | 25.9 (#67) | 17.9 (#85) |
| Grok 3 | 3.00 | 15.00 | 131K | 131K | 25.2 (#69) | 19.8 (#74) |
| GLM-4.5 Air | 0.20 | 1.10 | 128K | 96K | 23.2 (#76) | 23.8 (#58) |
| GPT-4.1 mini | 0.40 | 1.60 | 1.0M | 33K | 22.9 (#80) | 18.5 (#80) |
| Gemini 2.5 Flash | 0.30 | 2.50 | 1.0M | 66K | 20.6 (#88) | 17.8 (#86) |
| Magistral Medium | 2.00 | 5.00 | 128K | 64K | 18.8 (#96) | 16.0 (#91) |
| Claude 3.5 Haiku | 0.80 | 4.00 | 200K | 8K | 18.7 (#98) | 10.7 (#131) |
| Gemini 2.0 Flash | 0.15 | 0.60 | 1.0M | 8K | 18.5 (#100) | 13.6 (#110) |
| Llama 4 Maverick | 0.20 | 0.60 | 131K | 8K | 18.4 (#101) | 15.6 (#94) |
| Claude 3 Opus | 15.00 | 75.00 | 200K | 4K | 18.0 (#162) | 19.5 (#75) |
| Sonar Reasoning | 1.00 | 5.00 | 127K | 8K | 17.9 (#106) | N/A |
| GPT-4o | 2.50 | 10.00 | 128K | 16K | 17.3 (#109) | 16.7 (#88) |
| Qwen 3 235B | 0.20 | 0.60 | 41K | 16K | 17.0 (#113) | 14.0 (#105) |
| Magistral Small | 0.50 | 1.50 | 128K | 64K | 16.8 (#114) | 11.1 (#124) |
| DeepSeek V3 | 0.90 | 0.90 | 128K | 8K | 16.5 (#116) | 16.4 (#90) |
| DeepSeek R1 Distill Llama 70B | 0.75 | 0.99 | 131K | 131K | 16.0 (#120) | 11.4 (#121) |
| Claude 3.5 Sonnet | 3.00 | 15.00 | 200K | 8K | 15.9 (#123) | 30.2 (#40) |
| Sonar | 1.00 | 1.00 | 127K | 8K | 15.5 (#126) | N/A |
| Sonar Reasoning Pro | 2.00 | 8.00 | 127K | 8K | 15.2 (#127) | N/A |
| Devstral Small | 0.07 | 0.28 | 128K | 128K | 15.2 (#127) | 12.1 (#119) |
| Gemini 2.0 Flash-Lite | 0.075 | 0.30 | 1.0M | 8K | 14.7 (#135) | N/A |
| Llama 3.3 70B | 0.72 | 0.72 | 128K | 8K | 14.5 (#137) | 10.7 (#131) |
| Qwen 3 32B | 0.10 | 0.30 | 41K | 16K | 14.5 (#137) | N/A |
| Llama 4 Scout | 0.10 | 0.30 | 131K | 8K | 13.5 (#145) | 6.7 (#152) |
| Command A | 2.50 | 10.00 | 256K | 8K | 13.5 (#145) | 9.9 (#137) |
| Nova Pro | 0.80 | 3.20 | 300K | 8K | 13.5 (#145) | 11.0 (#126) |
| GPT-4.1 nano | 0.10 | 0.40 | 1.0M | 33K | 13.0 (#153) | 11.2 (#122) |
| GPT-4 Turbo | 10.00 | 30.00 | 128K | 4K | 12.8 (#157) | 13.1 (#114) |
| Nova Lite | 0.06 | 0.24 | 300K | 8K | 12.7 (#158) | 5.1 (#156) |
| GPT-4o mini | 0.15 | 0.60 | 128K | 16K | 12.6 (#161) | N/A |
| Llama 3.1 70B | 0.72 | 0.72 | 128K | 8K | 12.5 (#162) | 10.9 (#127) |
| Qwen 3 30B | 0.10 | 0.30 | 41K | 16K | 12.5 (#162) | 13.3 (#112) |
| Claude 3 Haiku | 0.25 | 1.25 | 200K | 4K | 12.3 (#167) | 6.7 (#152) |
| Mistral Saba 24B | 0.79 | 0.79 | 33K | 33K | 12.1 (#169) | N/A |
| Llama 3.2 90B | 0.72 | 0.72 | 128K | 8K | 11.9 (#174) | N/A |
| Llama 3.1 8B | 0.05 | 0.08 | 131K | 131K | 11.8 (#175) | 4.9 (#157) |
| Nova Micro | 0.035 | 0.14 | 128K | 8K | 10.3 (#185) | 4.1 (#160) |
| Mistral Small | 0.10 | 0.30 | 32K | 4K | 10.2 (#188) | N/A |
| Mistral Large | 2.00 | 6.00 | 32K | 4K | 9.9 (#194) | N/A |
| Mixtral 8x22B Instruct | 1.20 | 1.20 | 66K | 2K | 9.8 (#195) | N/A |
| Llama 3.2 3B | 0.15 | 0.15 | 128K | 8K | 9.7 (#196) | N/A |
| GPT-3.5 Turbo | 0.50 | 1.50 | 16K | 4K | 9.0 (#201) | 10.7 (#131) |
| Llama 3 70B | 0.59 | 0.79 | 8K | 8K | 8.9 (#204) | 6.8 (#151) |
| Llama 3.2 11B | 0.16 | 0.16 | 128K | 8K | 8.7 (#209) | 4.3 (#159) |
| Command R+ | 2.50 | 10.00 | 128K | 4K | 8.3 (#216) | N/A |
| Command R | 0.15 | 0.60 | 128K | 4K | 7.4 (#227) | N/A |
| Qwen 3 14B | 0.08 | 0.24 | 41K | 16K | 7.4 (#227) | N/A |
| Llama 3 8B | 0.05 | 0.08 | 8K | 8K | 6.4 (#233) | 4.0 (#161) |
| Llama 3.2 1B | 0.10 | 0.10 | 128K | 8K | 6.3 (#234) | 0.6 (#173) |
| Grok 3 Mini Fast | 0.60 | 4.00 | 131K | 131K | N/A | N/A |
| Grok 3 Mini | 0.30 | 0.50 | 131K | 131K | N/A | N/A |
| Grok 3 Fast | 5.00 | 25.00 | 131K | 131K | N/A | N/A |
| Grok 2 Vision | 2.00 | 10.00 | 33K | 33K | N/A | N/A |
| Grok 2 | 2.00 | 10.00 | 131K | 4K | N/A | N/A |
| V0 1.5 Md | 3.00 | 15.00 | 128K | 33K | N/A | N/A |
| V0 1.0 Md | 3.00 | 15.00 | 128K | 32K | N/A | N/A |
| Morph V3 Large | 0.90 | 1.90 | 33K | 16K | N/A | N/A |
| Morph V3 Fast | 0.80 | 1.20 | 33K | 16K | N/A | N/A |
| Pixtral Large | 2.00 | 6.00 | 128K | 4K | N/A | N/A |
| Pixtral 12B | 0.15 | 0.15 | 128K | 4K | N/A | N/A |
| Mistral Embed | 0.10 | N/A | N/A | N/A | N/A | N/A |
| Ministral 8B | 0.10 | 0.10 | 128K | 4K | N/A | N/A |
| Ministral 3B | 0.04 | 0.04 | 128K | 4K | N/A | N/A |
| Codestral Embed | 0.15 | N/A | N/A | N/A | N/A | N/A |
| Codestral | 0.30 | 0.90 | 256K | 4K | N/A | N/A |
| Mercury Coder Small | 0.25 | 1.00 | 32K | 16K | N/A | N/A |
| Gemma 2 9B | 0.20 | 0.20 | 8K | 8K | N/A | N/A |
| Embed V4.0 | 0.12 | N/A | N/A | N/A | N/A | N/A |
| Titan Embed Text V2 | 0.02 | N/A | N/A | N/A | N/A | N/A |
| Qwen3 Coder | 0.40 | 1.60 | 262K | 67K | N/A | N/A |