Pythia 70M Deduped is a 70M-parameter language model from EleutherAI's Pythia suite, trained on a globally deduplicated copy of the Pile. It is the smallest deduped model in the interpretability-focused series.
| Model ID | Category | Status | Input | Output | Parameters |
|---|---|---|---|---|---|
| eleutherai-pythia-70m-deduped | Language | Active | Text | Text | 0.07B |
Capabilities

- Input: text only (1/5 modalities)
- Output: text only (1/5 modalities)
- Additional capabilities: none (0/13)
Versions
| Version | Released | Context | Input / 1M | Output / 1M | Status |
|---|---|---|---|---|---|
| Pythia 70M Deduped | — | — | — | — | Current |
| Pythia 12B | — | 2K | $0.200 | $0.200 | Available |
| Pythia 160M Deduped | — | — | — | — | Available |
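As a minimal sketch, the deduped checkpoint above can be loaded through the Hugging Face `transformers` library. This assumes the repo id `EleutherAI/pythia-70m-deduped` on the Hugging Face Hub and that `transformers` and `torch` are installed; treat it as an illustration, not official usage documentation.

```python
# Minimal sketch: load Pythia 70M Deduped and generate a short continuation.
# Assumes the "EleutherAI/pythia-70m-deduped" Hub repo id (not stated in this page).
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/pythia-70m-deduped")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/pythia-70m-deduped")

# Tokenize a prompt and greedily decode up to 20 new tokens.
inputs = tokenizer("The Pile is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)

# Decode the full sequence (prompt + continuation) back to text.
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

Note that a 70M-parameter model is intended for interpretability research rather than quality text generation, so the continuation will be rough.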