DistilBART XSum 12-3 is a distilled BART model hosted on Hugging Face, fine-tuned on the XSum dataset for abstractive text summarization. It uses a 12-layer encoder / 3-layer decoder configuration.
| Model ID | huggingface-distilbart-xsum-12-3 |
|---|---|
| Type | Language |
| Status | Active |
| Input | Text |
| Output | Text |
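A minimal usage sketch with the Hugging Face `transformers` library, assuming the underlying checkpoint is `sshleifer/distilbart-xsum-12-3` (the catalog's `huggingface-distilbart-xsum-12-3` ID suggests this, but confirm the exact checkpoint name before relying on it):

```python
from transformers import pipeline

# Load the distilled BART summarizer (checkpoint name is an assumption
# based on the catalog ID above).
summarizer = pipeline("summarization", model="sshleifer/distilbart-xsum-12-3")

article = (
    "The tower is 324 metres tall, about the same height as an 81-storey "
    "building, and was the tallest man-made structure in the world for "
    "41 years until the Chrysler Building in New York was finished in 1930."
)

# XSum-style models produce short, single-sentence summaries.
result = summarizer(article, max_length=60, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```

The first call downloads the model weights; subsequent calls reuse the local cache.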
Capabilities
- Input: Text (1 of 5 modalities)
- Output: Text (1 of 5 modalities)
- Additional capabilities: none (0 of 13)
Versions
| Version | Released | Context | Input / 1M tokens | Output / 1M tokens | Status |
|---|---|---|---|---|---|
| DistilBART XSum 12-3 | — | — | — | — | Current |
| BART Large CNN | — | — | — | — | Available |
| BART Large CNN SAMSum | — | — | — | — | Available |
| BART Large MNLI | — | — | — | — | Available |
| DistilBART CNN 12-6 | — | — | — | — | Available |
| DistilBART CNN 6-6 | — | — | — | — | Available |
| DistilBART XSum 1.1 | — | — | — | — | Available |
| Narsil Bart Large Mnli Opti | — | — | — | — | Available |
| Navteca Bart Large Mnli | — | — | — | — | Available |