DistilRoBERTa Base


DistilRoBERTa Base is Hugging Face's language model: a distilled version of RoBERTa Base that offers faster inference and a reduced memory footprint for question answering and text-understanding tasks.
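As a sketch of basic usage (assuming the `transformers` library is installed and the `distilroberta-base` checkpoint is available on the Hugging Face Hub), the base model can be queried directly as a masked language model; downstream tasks such as question answering typically require fine-tuning a task head on top of it:

```python
# Minimal sketch: masked-language-model inference with DistilRoBERTa Base.
# Assumes `transformers` is installed and the `distilroberta-base`
# checkpoint can be downloaded from the Hugging Face Hub.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="distilroberta-base")

# RoBERTa-family models use `<mask>` as the mask token.
predictions = unmasker("The capital of France is <mask>.")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

Each prediction is a dict with the filled-in `sequence`, the predicted `token_str`, and a `score` (a softmax probability over the vocabulary).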
Spec
Canonical ID: huggingface-distilroberta-base
Type: Language
Status: Active
Creator: Hugging Face
Input Modalities: Text
Output Modalities: Text

Capabilities

Input (1/5): Text (Image, Audio, Video, and PDF are not supported)
Output (1/5): Text (Image, Audio, Video, and Embedding are not supported)
Capabilities (0/13): none (Reasoning, Adaptive Reasoning, Function Calling, Parallel Function Calling, Structured Outputs, Native JSON Schema, Web Search, URL Context, Computer Use, Code Execution, File Search, Prompt Caching, and Assistant Prefill are not supported)

Versions

Version / Status (Released, Context, Input / 1M, and Output / 1M are not listed):
DistilRoBERTa Base: Current
Cross Encoder NLI DistilRoBERTa Base: Available
Cross Encoder NLI RoBERTa Base: Available
DistilRoBERTa Base: Available
RoBERTa Base: Available
RoBERTa Base OpenAI Detector: Available
RoBERTa Large: Available
RoBERTa Large OpenAI Detector: Available

Model IDs