As described in Longformer: The Long-Document Transformer by Iz Beltagy, Matthew E. Peters, and Arman Cohan, led-large-16384 was initialized from bart-large, since both models share the exact same architecture. To be able to process 16K tokens, bart-large's position embedding matrix was simply copied 16 times.
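As a minimal sketch of that position-embedding copy, assuming the Hugging Face transformers library (this illustrates the idea only and is not the authors' actual conversion script):

```python
from transformers import BartModel

# Load bart-large, whose learned position embeddings cover roughly 1K positions.
bart = BartModel.from_pretrained("facebook/bart-large")
pos_emb = bart.encoder.embed_positions.weight  # shape (1026, 1024): 1024 positions plus an offset of 2

# Tile the matrix 16 times along the position axis so that roughly
# 16K positions can be addressed, as described above.
long_pos_emb = pos_emb.detach().clone().repeat(16, 1)  # shape (16416, 1024)
print(long_pos_emb.shape)
```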
This model is especially interesting for long-range summarization and question answering.
Fine-tuning for downstream tasks
This notebook shows how led-large-16384 can effectively be fine-tuned on a downstream task.
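As a minimal usage sketch (not the contents of the notebook), led-large-16384 can be loaded with transformers and run on a long document; long_document below is a placeholder:

```python
import torch
from transformers import LEDTokenizer, LEDForConditionalGeneration

tokenizer = LEDTokenizer.from_pretrained("allenai/led-large-16384")
model = LEDForConditionalGeneration.from_pretrained("allenai/led-large-16384")

long_document = "..."  # placeholder: any document of up to ~16K tokens

inputs = tokenizer(long_document, max_length=16384, truncation=True, return_tensors="pt")

# LED attends locally everywhere except where global attention is requested;
# putting global attention on the first token is the usual choice for summarization.
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

summary_ids = model.generate(
    inputs["input_ids"],
    global_attention_mask=global_attention_mask,
    max_length=512,
    num_beams=4,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```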
Runs of allenai led-large-16384 on huggingface.co
Total runs: 1.6K
24-hour runs: 22
3-day runs: 34
7-day runs: -24
30-day runs: 461
More Information About the led-large-16384 Model on huggingface.co
led-large-16384 is an AI model hosted on huggingface.co, where its output can be tried instantly. huggingface.co supports a free trial of led-large-16384 as well as paid use, and the model can be called through an API from Node.js, Python, or plain HTTP.
huggingface.co is an online trial and API platform that integrates led-large-16384's capabilities, including API services, and provides a free online trial; you can try led-large-16384 for free by clicking the link below.
allenai led-large-16384 online free url in huggingface.co:
led-large-16384 is an open-source model whose code is available on GitHub, and any user can install it from there for free. At the same time, huggingface.co lets users run led-large-16384 directly for debugging and trial, and the API can likewise be used for free.
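As an example of the Python/HTTP usage mentioned above, the model can be queried over the standard Hugging Face Inference API endpoint; YOUR_TOKEN is a placeholder for your own access token, and whether the free tier currently serves this particular model may vary:

```python
import requests

API_URL = "https://api-inference.huggingface.co/models/allenai/led-large-16384"
headers = {"Authorization": "Bearer YOUR_TOKEN"}  # placeholder access token

def query(payload):
    # Send the document to the hosted model and return the JSON response.
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

output = query({"inputs": "A very long document to summarize ..."})
print(output)
```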