DistilRoBERTa Base is a distilled version of Hugging Face's RoBERTa Base language model, offering faster inference and a reduced memory footprint for question answering and text-understanding tasks.
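Since the card's key claim is that distillation trades model size for speed, a minimal sketch of the knowledge-distillation objective such student models are trained with may help. This is an illustrative, dependency-free implementation of the soft-target KL loss; the function names are my own and this is not the actual DistilRoBERTa training code.

```python
import math

def softmax(logits, temperature=1.0):
    # Convert raw logits to a probability distribution, optionally
    # "softened" by a temperature > 1 so the teacher's minority
    # predictions carry more signal.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence between the softened teacher distribution (p)
    # and the softened student distribution (q): the student is
    # penalized for diverging from the teacher's outputs.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Identical logits give zero loss; diverging logits give a positive loss.
```

In the actual DistilBERT/DistilRoBERTa recipe this soft-target loss is combined with the usual masked-language-modeling loss, which is what lets the smaller student retain most of the teacher's accuracy.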
| Model ID | Category | Status | Input | Output |
|---|---|---|---|---|
| huggingface-distilroberta-base | Language | Active | Text | Text |
Capabilities

- Input (1/5 modalities): Text
- Output (1/5 modalities): Text
- Capabilities (0/13): none listed
Versions
| Version | Released | Context | Input / 1M | Output / 1M | Status |
|---|---|---|---|---|---|
| DistilRoBERTa Base | — | — | — | — | Current |
| Cross Encoder NLI DistilRoBERTa Base | — | — | — | — | Available |
| Cross Encoder NLI RoBERTa Base | — | — | — | — | Available |
| DistilRoBERTa Base | — | — | — | — | Available |
| RoBERTa Base | — | — | — | — | Available |
| RoBERTa Base OpenAI Detector | — | — | — | — | Available |
| RoBERTa Large | — | — | — | — | Available |
| RoBERTa Large OpenAI Detector | — | — | — | — | Available |