As described in Longformer: The Long-Document Transformer by Iz Beltagy, Matthew E. Peters, and Arman Cohan, led-base-16384 was initialized from bart-base, since both models share the exact same architecture. To be able to process 16K tokens, bart-base's position embedding matrix was simply copied 16 times.
This model is especially interesting for long-range summarization and question answering.
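The position-embedding trick above can be illustrated with a minimal sketch. This is not allenai's actual conversion script; the matrix size (1024 positions, hidden size 768 for bart-base) and the plain-list representation are stand-ins for the learned tensor that real code would handle with torch.

```python
# Hypothetical sketch of extending bart-base's learned position embeddings
# from 1024 positions to 16384 by copying the whole matrix 16 times.
hidden_size = 768          # bart-base's hidden dimension
bart_max_pos = 1024        # bart-base's maximum number of positions

# Stand-in for the learned position embedding matrix (real code would use
# a torch tensor loaded from the bart-base checkpoint).
bart_pos_emb = [[0.0] * hidden_size for _ in range(bart_max_pos)]

# Tile the matrix 16 times along the position axis: 16 * 1024 = 16384.
led_pos_emb = bart_pos_emb * 16

assert len(led_pos_emb) == 16384
assert len(led_pos_emb[0]) == hidden_size
```

Because each 1024-position block is an exact copy, the model starts from sensible (if repetitive) position information and can then be fine-tuned on long inputs.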
Fine-tuning for downstream tasks
This notebook shows how led-base-16384 can effectively be fine-tuned on a downstream task.
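One LED-specific detail worth knowing before fine-tuning: in addition to the usual attention mask, LED models accept a global attention mask, and a common choice for summarization is to give only the first token global attention. A minimal sketch of building such a mask (the sequence length of 4096 is an arbitrary example; real code would pass these masks to the model alongside the input IDs):

```python
# Hedged sketch: building a global attention mask for an LED model.
# Convention (per the LED documentation): 1 = global attention, 0 = local.
seq_len = 4096                              # example input length, up to 16384

attention_mask = [1] * seq_len              # all positions are real tokens
global_attention_mask = [0] * seq_len       # local attention everywhere...
global_attention_mask[0] = 1                # ...except the first (<s>) token

assert sum(global_attention_mask) == 1
```

For question answering, global attention is often put on all question tokens instead; the right pattern depends on the task.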