Disclaimer: The team releasing BART did not write a model card for this model, so this model card has been written by the Hugging Face team.
Model description
BART is a transformer encoder-decoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder. BART is pre-trained by (1) corrupting text with an arbitrary noising function, and (2) learning a model to reconstruct the original text.
BART is particularly effective when fine-tuned for text generation (e.g. summarization, translation) but also works well for comprehension tasks (e.g. text classification, question answering).
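To make the pre-training objective concrete, here is a toy, pure-Python sketch of the text-infilling corruption described above: a contiguous span of tokens is replaced with a single mask token, and the seq2seq model is trained to reconstruct the original text. This is only an illustration, not the actual noising code (the real pre-training samples span lengths from a Poisson distribution and also applies other corruptions such as sentence permutation):

```python
import random

MASK = "<mask>"

def infill_noise(tokens, span_len=2, seed=0):
    """Toy text-infilling corruption: replace a contiguous span of
    `span_len` tokens with a single mask token."""
    rng = random.Random(seed)
    start = rng.randrange(0, max(1, len(tokens) - span_len))
    return tokens[:start] + [MASK] + tokens[start + span_len:]

original = "the quick brown fox jumps over the lazy dog".split()
corrupted = infill_noise(original)
# The encoder sees `corrupted`; the decoder is trained to regenerate `original`.
```

Note that the whole span collapses to one mask token, so the model must also infer how many tokens are missing, which is what distinguishes text infilling from BERT-style single-token masking.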
Intended uses & limitations
You can use the raw model for text infilling. However, the model is mostly meant to be fine-tuned on a supervised dataset. See the model hub to look for fine-tuned versions on a task that interests you.
How to use
Here is how to use this model in PyTorch:
from transformers import BartTokenizer, BartModel

# Load the tokenizer and the base encoder-decoder model
tokenizer = BartTokenizer.from_pretrained('facebook/bart-large')
model = BartModel.from_pretrained('facebook/bart-large')

# Tokenize a sentence and run a forward pass
inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
outputs = model(**inputs)

# Last hidden states of the decoder, shape (batch_size, sequence_length, hidden_size)
last_hidden_states = outputs.last_hidden_state
BibTeX entry and citation info
@article{DBLP:journals/corr/abs-1910-13461,
author = {Mike Lewis and
Yinhan Liu and
Naman Goyal and
Marjan Ghazvininejad and
Abdelrahman Mohamed and
Omer Levy and
Veselin Stoyanov and
Luke Zettlemoyer},
title = {{BART:} Denoising Sequence-to-Sequence Pre-training for Natural Language
Generation, Translation, and Comprehension},
journal = {CoRR},
volume = {abs/1910.13461},
year = {2019},
url = {http://arxiv.org/abs/1910.13461},
eprinttype = {arXiv},
eprint = {1910.13461},
timestamp = {Thu, 31 Oct 2019 14:02:26 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-1910-13461.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}