DistilBART XSum 1.1 is a distilled version of Meta's BART language model, fine-tuned on the XSum dataset to generate highly abstractive single-sentence summaries of news articles.
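A minimal usage sketch of single-sentence summarization with this model, assuming the Hugging Face `transformers` library is installed and that the checkpoint is published on the Hub under the `sshleifer/distilbart-xsum-1-1` ID (the naming scheme matches the other DistilBART XSum checkpoints listed below; verify the exact ID before use):

```python
# Sketch: XSum-style abstractive summarization with a DistilBART checkpoint.
# Assumes the "sshleifer/distilbart-xsum-1-1" Hub ID; swap in the correct
# model identifier if it differs in your environment.
from transformers import pipeline

summarizer = pipeline("summarization", model="sshleifer/distilbart-xsum-1-1")

article = (
    "The tower is 324 metres tall, about the same height as an 81-storey "
    "building, and was the tallest man-made structure in the world for "
    "41 years until the Chrysler Building in New York was finished in 1930."
)

# XSum fine-tuning biases the model toward one short, abstractive sentence,
# so a modest max_length is appropriate here.
result = summarizer(article, max_length=40, min_length=5, do_sample=False)
print(result[0]["summary_text"])
```

Because the model targets XSum-style output, expect a single compressed sentence rather than an extractive multi-sentence digest.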
Versions
| Version | Released | Context | Input ($/1M tokens) | Output ($/1M tokens) | Status |
|---|---|---|---|---|---|
| DistilBART XSum 1.1 | — | — | — | — | Current |
| BART Large CNN | — | — | — | — | Available |
| BART Large CNN SAMSum | — | — | — | — | Available |
| BART Large MNLI | — | — | — | — | Available |
| DistilBART CNN 12-6 | — | — | — | — | Available |
| DistilBART CNN 6-6 | — | — | — | — | Available |
| DistilBART XSum 12-3 | — | — | — | — | Available |
| Narsil Bart Large Mnli Opti | — | — | — | — | Available |
| Navteca Bart Large Mnli | — | — | — | — | Available |