Pythia 160M Deduped is a 160M-parameter language model from EleutherAI's Pythia suite, trained on a globally deduplicated version of the Pile dataset and designed for interpretability research.
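As a quick orientation, here is a minimal sketch of loading and sampling from the model, assuming the standard Hugging Face `transformers` API and the public `EleutherAI/pythia-160m-deduped` checkpoint (not an official example from this page):

```python
# Minimal sketch: load Pythia 160M Deduped from the Hugging Face Hub.
# Assumes the `transformers` library and the public
# EleutherAI/pythia-160m-deduped checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/pythia-160m-deduped"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("The Pile is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0]))
```

At 160M parameters the model runs comfortably on CPU, which is part of what makes the Pythia suite convenient for interpretability work.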
| Model ID | Type | Status | Input | Output | Parameters |
|---|---|---|---|---|---|
| eleutherai-pythia-160m-deduped | Language | Active | Text | Text | 0.16B |
Capabilities

- Input modalities (1/5): text only
- Output modalities (1/5): text only
- Additional capabilities: 0/13
Versions
| Version | Released | Context | Input / 1M | Output / 1M | Status |
|---|---|---|---|---|---|
| Pythia 160M Deduped | — | — | — | — | Current |
| Pythia 12B | — | 2K | $0.200 | $0.200 | Available |
| Pythia 70M Deduped | — | — | — | — | Available |