sshleifer / distill-pegasus-cnn-16-4

huggingface.co
Total runs: 456
24-hour runs: 0
3-day runs: 13
7-day runs: 33
30-day runs: 268
Last updated: October 8, 2020
summarization

Introduction to distill-pegasus-cnn-16-4

Model Details of distill-pegasus-cnn-16-4

Pegasus Models

See Docs: here

Original TF 1 code here

Authors: Jingqing Zhang, Yao Zhao, Mohammad Saleh and Peter J. Liu on Dec 18, 2019

Maintained by: @sshleifer

Task: Summarization
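The checkpoint can be loaded directly with the `transformers` library. A minimal usage sketch (not part of the original card; the example article and generation parameters are my own placeholders):

```python
# Illustrative usage sketch: summarize a news article with this
# checkpoint via the transformers pipeline API. Requires the
# `transformers` and `torch` packages; weights download on first use.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="sshleifer/distill-pegasus-cnn-16-4",
)

article = (
    "PG&E stated it scheduled the blackouts in response to forecasts for "
    "high winds amid dry conditions. The aim is to reduce the risk of "
    "wildfires. Nearly 800 thousand customers were scheduled to be affected."
)
summary = summarizer(article, max_length=60, min_length=10)[0]["summary_text"]
print(summary)
```

The checkpoint name suggests a distilled student with 16 encoder and 4 decoder layers fine-tuned on CNN/DailyMail, trading some ROUGE for faster inference than the full pegasus model.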

The following is copied from the authors' README.

Mixed & Stochastic Checkpoints

We train a pegasus model with sampled gap sentence ratios on both C4 and HugeNews, and stochastically sample important sentences. The updated results are reported in the table below.

Each cell reports ROUGE-1/ROUGE-2/ROUGE-L F1 scores.

dataset         C4                  HugeNews            Mixed & Stochastic
xsum            45.20/22.06/36.99   47.21/24.56/39.25   47.60/24.83/39.64
cnn_dailymail   43.90/21.20/40.76   44.17/21.47/41.11   44.16/21.56/41.30
newsroom        45.07/33.39/41.28   45.15/33.51/41.33   45.98/34.20/42.18
multi_news      46.74/17.95/24.26   47.52/18.72/24.91   47.65/18.75/24.95
gigaword        38.75/19.96/36.14   39.12/19.86/36.24   39.65/20.47/36.76
wikihow         43.07/19.70/34.79   41.35/18.51/33.42   46.39/22.12/38.41 *
reddit_tifu     26.54/8.94/21.64    26.63/9.01/21.60    27.99/9.81/22.94
big_patent      53.63/33.16/42.25   53.41/32.89/42.07   52.29/33.08/41.66 *
arxiv           44.70/17.27/25.80   44.67/17.18/25.73   44.21/16.95/25.67
pubmed          45.49/19.90/27.69   45.09/19.56/27.42   45.97/20.15/28.25
aeslc           37.69/21.85/36.84   37.40/21.22/36.45   37.68/21.25/36.51
billsum         57.20/39.56/45.80   57.31/40.19/45.82   59.67/41.58/47.59
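Each cell in the table is a slash-separated ROUGE-1/ROUGE-2/ROUGE-L F1 triple. A tiny helper (my own, purely illustrative) to pull the three scores apart:

```python
# Illustrative helper: split a "R1/R2/RL" cell from the table above
# into named ROUGE F1 scores.
def parse_rouge(cell: str) -> dict:
    r1, r2, rl = (float(x) for x in cell.split("/"))
    return {"rouge1": r1, "rouge2": r2, "rougeL": rl}

# cnn_dailymail, Mixed & Stochastic column
print(parse_rouge("44.16/21.56/41.30"))
```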

The "Mixed & Stochastic" model has the following changes (relative to pegasus-large in the paper):

  • trained on both C4 and HugeNews (the dataset mixture is weighted by their number of examples).
  • trained for 1.5M steps instead of 500k (we observed slower convergence on pretraining perplexity).
  • the model uniformly samples a gap-sentence ratio between 15% and 45%.
  • important sentences are sampled after applying 20% uniform noise to the importance scores.
  • the SentencePiece tokenizer is updated to be able to encode the newline character.
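The two stochastic choices above (a uniformly sampled gap-sentence ratio, and uniform noise on importance scores before selecting gap sentences) can be sketched as follows. This is an illustrative toy, not the authors' TF code; the function names and toy scores are mine:

```python
# Illustrative sketch of the stochastic pretraining choices described
# above: a gap-sentence ratio drawn uniformly from [15%, 45%], and
# importance scores perturbed with +/-20% uniform noise before the
# top-scoring sentences are selected for masking.
import random


def sample_gap_sentence_ratio(rng: random.Random) -> float:
    """Uniformly sample the fraction of sentences to mask."""
    return rng.uniform(0.15, 0.45)


def select_gap_sentences(scores, ratio, rng, noise=0.20):
    """Return indices of the top `ratio` sentences after noising scores."""
    noisy = [s * (1 + rng.uniform(-noise, noise)) for s in scores]
    k = max(1, round(ratio * len(scores)))
    ranked = sorted(range(len(scores)), key=lambda i: noisy[i], reverse=True)
    return sorted(ranked[:k])


rng = random.Random(0)
ratio = sample_gap_sentence_ratio(rng)
print(select_gap_sentences([0.9, 0.1, 0.5, 0.7, 0.3], ratio, rng))
```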

(*) The wikihow and big_patent numbers are not comparable because of changes in tokenization and data:

  • the wikihow dataset contains newline characters, which are useful for paragraph segmentation; the C4 and HugeNews models' SentencePiece tokenizer does not encode the newline and loses this information.
  • we updated the BigPatent dataset to preserve casing; some formatting cleanups were also changed (please refer to the corresponding change in TFDS).


Citation



@misc{zhang2019pegasus,
    title={PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization},
    author={Jingqing Zhang and Yao Zhao and Mohammad Saleh and Peter J. Liu},
    year={2019},
    eprint={1912.08777},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}


More Information About distill-pegasus-cnn-16-4

distill-pegasus-cnn-16-4 is hosted on huggingface.co, where it can be tried in the browser or called through the hosted Inference API from Node.js, Python, or plain HTTP.

Model URL:

https://huggingface.co/sshleifer/distill-pegasus-cnn-16-4
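A hedged HTTP sketch of calling the hosted model: the endpoint below follows the standard Hugging Face Inference API URL pattern, and `HF_TOKEN` is a placeholder for a real access token. The request is built but deliberately not sent, since network access and a valid token are assumptions here.

```python
# Illustrative sketch: calling the hosted model through the Hugging Face
# Inference API over plain HTTP, using only the standard library.
import json
import urllib.request

API_URL = (
    "https://api-inference.huggingface.co/models/"
    "sshleifer/distill-pegasus-cnn-16-4"
)
HF_TOKEN = "hf_..."  # placeholder: substitute a real access token

payload = json.dumps({"inputs": "Some long article text to summarize."}).encode()
req = urllib.request.Request(
    API_URL,
    data=payload,
    headers={
        "Authorization": f"Bearer {HF_TOKEN}",
        "Content-Type": "application/json",
    },
)
# Uncomment to actually send the request (needs network and a token):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```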


Provider of distill-pegasus-cnn-16-4 huggingface.co

sshleifer
