
RoBERTa Base


RoBERTa Base is Meta's language model: a robustly optimized BERT pretraining approach at base scale, widely used for text classification, question answering, and NLU benchmarks.
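The "robustly optimized" part of RoBERTa's pretraining includes dynamic masking: instead of fixing masked positions once during preprocessing (as in the original BERT), the mask pattern is regenerated every time a sequence is seen. A minimal sketch of that idea, with a toy vocabulary and the standard 80/10/10 replacement scheme as stated assumptions (real training operates on BPE token IDs, not word strings):

```python
import random

MASK = "<mask>"
TOY_VOCAB = ["the", "cat", "sat", "on", "mat", "dog", "ran"]  # placeholder vocabulary

def dynamic_mask(tokens, mask_prob=0.15, rng=None):
    """Sample a fresh mask pattern over the sequence.

    Each position is selected with probability mask_prob; of the selected
    positions, 80% are replaced by <mask>, 10% by a random token, and 10%
    are left unchanged. Returns the corrupted sequence plus per-position
    labels (the original token where a prediction is required, else None).
    Calling this again with a new RNG state yields a different pattern,
    which is what makes the masking "dynamic".
    """
    rng = rng or random.Random()
    out, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok  # model must predict the original token here
            r = rng.random()
            if r < 0.8:
                out[i] = MASK
            elif r < 0.9:
                out[i] = rng.choice(TOY_VOCAB)
            # else: keep the original token unchanged
    return out, labels

# Each epoch draws a new corruption of the same sentence:
masked, targets = dynamic_mask("the cat sat on the mat".split(), rng=random.Random(0))
```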
Spec
Canonical ID: meta-roberta-base
Type: Language
Status: Active
Creator: Meta
Input Modalities: Text
Output Modalities: Text

Capabilities

Input (1/5): Text
  Not supported: Image, Audio, Video, PDF
Output (1/5): Text
  Not supported: Image, Audio, Video, Embedding
Capabilities (0/13): none
  Not supported: Reasoning, Adaptive Reasoning, Function Calling, Parallel Function Calling, Structured Outputs, Native JSON Schema, Web Search, URL Context, Computer Use, Code Execution, File Search, Prompt Caching, Assistant Prefill

Versions

Version                                 Released  Context  Input / 1M  Output / 1M  Status
RoBERTa Base                            -         -        -           -            Current
Cross Encoder NLI DistilRoBERTa Base    -         -        -           -            Available
Cross Encoder NLI RoBERTa Base          -         -        -           -            Available
DistilRoBERTa Base                      -         -        -           -            Available
RoBERTa Base OpenAI Detector            -         -        -           -            Available
RoBERTa Large                           -         -        -           -            Available
RoBERTa Large OpenAI Detector           -         -        -           -            Available

Model IDs