Llama 3.1 Sonar Huge 128K Online
Provider: Perplexity · Mode: Text · Deprecated: 2025-02-22
Llama 3.1 Sonar Huge 128K Online is a text model from Perplexity with a context window of 127K tokens and a max output of 127K tokens. Pricing is $5.00 per million input tokens and $5.00 per million output tokens.
Specifications
| Specification | Value |
|---|---|
| Model Key | perplexity/llama-3.1-sonar-huge-128k-online |
| Provider | Perplexity |
| LiteLLM Provider | perplexity |
| Mode | Text |
| Canonical Name | llama-3.1-sonar-huge |
| Context Window | 127K tokens |
| Max Output | 127K tokens |
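
Because the model key and LiteLLM provider are listed above, the model could be called through LiteLLM's standard `completion` interface. The snippet below is a minimal sketch, not an official example: it assumes the usual `PERPLEXITY_API_KEY` environment variable that LiteLLM reads for the perplexity provider, uses a placeholder key and prompt, and may be rejected now that the model is deprecated.

```python
# Minimal sketch: calling this model via LiteLLM using the model key from
# the specifications table. The API key value below is a hypothetical
# placeholder; since the model is deprecated, the request may now fail.
import os
from litellm import completion

os.environ["PERPLEXITY_API_KEY"] = "pplx-..."  # placeholder, not a real key

response = completion(
    model="perplexity/llama-3.1-sonar-huge-128k-online",  # model key from the table
    messages=[{"role": "user", "content": "Summarize today's top AI news."}],
)
print(response.choices[0].message.content)
```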
Capabilities
✗ Vision · ✗ Function Calling · ✗ Reasoning · ✗ JSON Schema · ✗ System Messages · ✗ Web Search · ✗ Prompt Caching · ✗ Audio Input · ✗ Audio Output
Pricing
| Type | Per 1K Tokens | Per 1M Tokens |
|---|---|---|
| Input Tokens | $0.0050 | $5.00 |
| Output Tokens | $0.0050 | $5.00 |
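
Since input and output tokens are both billed at a flat $5.00 per million, estimating the cost of a request is simple arithmetic. The sketch below is illustrative only; the token counts are made-up values, not measurements.

```python
# Illustrative cost estimate at the listed flat rate of $5.00 per 1M tokens
# (equivalently $0.0050 per 1K tokens) for both input and output.
INPUT_PRICE_PER_TOKEN = 5.00 / 1_000_000
OUTPUT_PRICE_PER_TOKEN = 5.00 / 1_000_000

input_tokens = 12_000   # hypothetical prompt size
output_tokens = 1_500   # hypothetical completion size

cost = input_tokens * INPUT_PRICE_PER_TOKEN + output_tokens * OUTPUT_PRICE_PER_TOKEN
print(f"Estimated cost: ${cost:.4f}")  # -> Estimated cost: $0.0675
```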
Similar Models
Models with similar capabilities and context window size.
| Model | Provider | Mode | Input Price (per 1M) | Output Price (per 1M) | Context | Max Output | Vision | Functions |
|---|---|---|---|---|---|---|---|---|
| GigaChat 2 Lite | GigaChat | Text | N/A | N/A | 128K | 8K | no | yes |
| GigaChat 2 Max | GigaChat | Text | N/A | N/A | 128K | 8K | yes | yes |
| GigaChat 2 Pro | GigaChat | Text | N/A | N/A | 128K | 8K | yes | yes |
| Llama 3.1 Sonar Large 128K Online | Perplexity | Text | $1.00 | $1.00 | 127K | 127K | no | no |
| Llama 3.1 Sonar Small 128K Online | Perplexity | Text | $0.200 | $0.200 | 127K | 127K | no | no |
| Llama 3.2 90B Vision Instruct MaaS | Google Vertex AI | Text | N/A | N/A | 128K | 2K | yes | no |
| Mistral 7B Instruct V0.3 | OVHcloud | Text | $0.100 | $0.100 | 127K | 127K | no | yes |
| Sonar | Vercel AI Gateway | Text | $1.00 | $1.00 | 127K | 8K | no | no |
| Sonar Reasoning | Vercel AI Gateway | Text | $1.00 | $5.00 | 127K | 8K | no | no |
| Sonar Reasoning Pro | Vercel AI Gateway | Text | $2.00 | $8.00 | 127K | 8K | no | no |