DistilRoBERTa Base is a distilled version of RoBERTa-base, Meta AI's language model, produced by Hugging Face. It retains strong text classification and NLU capabilities at roughly half the inference cost of the full model.
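Since DistilRoBERTa Base is a masked language model, a quick way to exercise it is the Hugging Face `fill-mask` pipeline. The sketch below assumes the `transformers` library is installed and that the `distilroberta-base` checkpoint can be downloaded from the Hub; note that RoBERTa-family models use `<mask>` as the mask token.

```python
# Minimal sketch: masked-token prediction with distilroberta-base.
# Assumes `transformers` is installed and the checkpoint is reachable.
from transformers import pipeline

fill = pipeline("fill-mask", model="distilroberta-base")

# The pipeline returns the top candidates for the <mask> slot,
# each with a token string and a probability score.
predictions = fill("The capital of France is <mask>.")
for p in predictions:
    print(p["token_str"].strip(), round(p["score"], 3))
```

For downstream tasks such as classification or NLI, the base checkpoint is typically fine-tuned first (the Cross Encoder NLI variants listed below are examples of such fine-tunes).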
Versions
| Version | Released | Context | Input / 1M | Output / 1M | Status |
|---|---|---|---|---|---|
| DistilRoBERTa Base | — | — | — | — | Current |
| Cross Encoder NLI DistilRoBERTa Base | — | — | — | — | Available |
| Cross Encoder NLI RoBERTa Base | — | — | — | — | Available |
| RoBERTa Base | — | — | — | — | Available |
| RoBERTa Base OpenAI Detector | — | — | — | — | Available |
| RoBERTa Large | — | — | — | — | Available |
| RoBERTa Large OpenAI Detector | — | — | — | — | Available |