Llama 3.1 Sonar Large 128K Online
Llama 3.1 Sonar Large 128K Online is a text model from Perplexity with a 127K-token context window and a maximum output of 127K tokens.
Capabilities
- ✗ Vision
- ✗ Function Calling
- ✗ Reasoning
- ✗ JSON Schema
- ✗ System Messages
- ✗ Web Search
- ✗ Prompt Caching
- ✗ Audio Input
- ✗ Audio Output
Specifications
| Specification | Value |
|---|---|
| Model Key | perplexity/llama-3.1-sonar-large-128k-online |
| Provider | Perplexity |
| Provider ID | perplexity |
| Mode | Text |
| Canonical Name | llama-3.1-sonar-large-128k |
| Context Window | 127K tokens |
| Max Output | 127K tokens |
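As a practical illustration of the specifications above, the sketch below assembles a chat-completion request payload using this model's key and enforces the 127K-token context window. This is a hypothetical sketch: the payload shape assumes an OpenAI-compatible chat-completions API, and `build_request` and its token counts are illustrative names, not part of any documented client library. Only the model key and context-window figure come from the tables on this page.

```python
# Context window from the Specifications table above.
CONTEXT_WINDOW = 127_000  # tokens

MODEL_KEY = "perplexity/llama-3.1-sonar-large-128k-online"


def build_request(messages, max_tokens, prompt_tokens):
    """Assemble a chat-completion payload (hypothetical helper),
    rejecting requests that cannot fit in the 127K-token context window.

    prompt_tokens is a caller-supplied estimate; a real client would
    count tokens with the model's tokenizer.
    """
    if prompt_tokens + max_tokens > CONTEXT_WINDOW:
        raise ValueError("prompt + max_tokens exceeds the 127K context window")
    return {
        "model": MODEL_KEY,
        "messages": messages,
        "max_tokens": max_tokens,
    }


payload = build_request(
    messages=[{"role": "user", "content": "Summarize today's tech news."}],
    max_tokens=1024,
    prompt_tokens=12,  # rough estimate for this short prompt
)
```

The resulting dictionary could then be sent to whichever endpoint serves this model key; the guard simply ensures the requested output budget leaves room for the prompt within the shared context window.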
Pricing
| Type | Per 1K Tokens | Per 1M Tokens |
|---|---|---|
| Input Tokens | N/A | N/A |
| Output Tokens | N/A | N/A |
Benchmarks
No benchmark data is available for this model.
All Variants
All available versions, regions, and API endpoints for Llama 3.1 Sonar Large 128K Online.
| Model Key | Provider | Mode | Input Price, $ | Output Price, $ | Context | Max Output | Vision | Functions |
|---|---|---|---|---|---|---|---|---|
| perplexity/llama-3.1-sonar-large-128k-chat | Perplexity | Text | N/A | N/A | 131K | 131K | no | no |
| perplexity/llama-3.1-sonar-large-128k-online | Perplexity | Text | N/A | N/A | 127K | 127K | no | no |