deepset / bert-medium-squad2-distilled

huggingface.co
Total runs: 618
24-hour runs: 0
7-day runs: 41
30-day runs: -1.4K
Last updated: September 24, 2024
question-answering

Model Details of bert-medium-squad2-distilled

Overview

Language model: deepset/bert-medium-squad2-distilled
Language: English
Training data: SQuAD 2.0 training set
Eval data: SQuAD 2.0 dev set
Infrastructure: 1x V100 GPU
Published: April 21, 2021

Details
  • Haystack's model-distillation feature was used for training, with deepset/bert-large-uncased-whole-word-masking-squad2 as the teacher model.
Hyperparameters
batch_size = 6
n_epochs = 2
max_seq_len = 384
learning_rate = 3e-5
lr_schedule = LinearWarmup
embeds_dropout_prob = 0.1
temperature = 5
distillation_loss_weight = 1
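The temperature and distillation_loss_weight hyperparameters above configure a standard soft-target distillation loss: the student is trained to match the teacher's temperature-scaled output distribution, optionally blended with the usual hard-label cross-entropy. A minimal NumPy sketch of that loss (an illustration of the technique, not Haystack's actual implementation):

```python
import numpy as np

def softmax(logits, temperature=1.0):
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=5.0, distillation_loss_weight=1.0):
    # Soft-target term: KL divergence between the teacher's and student's
    # temperature-scaled distributions, scaled by T^2 (Hinton et al. style).
    t = softmax(teacher_logits, temperature)
    s = softmax(student_logits, temperature)
    kl = np.sum(t * (np.log(t) - np.log(s)), axis=-1).mean() * temperature ** 2
    # Hard-label term: cross-entropy on the student's unscaled outputs.
    p = softmax(student_logits)
    ce = -np.log(p[np.arange(len(labels)), labels]).mean()
    # With distillation_loss_weight = 1 (as above), training uses only
    # the soft-target term.
    return (1 - distillation_loss_weight) * ce + distillation_loss_weight * kl
```

With distillation_loss_weight = 1, as in this model's training, the hard-label term drops out entirely and the student learns purely from the teacher's soft targets.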
Performance
"exact": 68.6431398972458
"f1": 72.7637083790805
Authors
  • Timo Möller: timo.moeller [at] deepset.ai
  • Julian Risch: julian.risch [at] deepset.ai
  • Malte Pietsch: malte.pietsch [at] deepset.ai
  • Michel Bartels: michel.bartels [at] deepset.ai
About us

We bring NLP to the industry via open source!
Our focus: industry-specific language models & large-scale QA systems.

Get in touch: Twitter | LinkedIn | Discord | GitHub Discussions | Website

By the way: we're hiring!

Runs of deepset bert-medium-squad2-distilled on huggingface.co

Total runs: 618
24-hour runs: 0
3-day runs: 16
7-day runs: 41
30-day runs: -1.4K

More Information About bert-medium-squad2-distilled huggingface.co Model

The model is released under the MIT license; for details, visit:

https://choosealicense.com/licenses/mit

bert-medium-squad2-distilled huggingface.co

bert-medium-squad2-distilled is an AI model hosted on huggingface.co, where it can be used instantly. huggingface.co supports a free trial of the model as well as paid use, and the model can be called through an API from Node.js, Python, or plain HTTP.
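For Python, one straightforward way to call the model is through the Transformers pipeline API. A minimal sketch (the question and context strings here are made-up examples):

```python
from transformers import pipeline

# Downloads the model weights from huggingface.co on first use.
qa = pipeline("question-answering",
              model="deepset/bert-medium-squad2-distilled")

result = qa(
    question="What was used as the teacher model?",
    context=(
        "deepset/bert-medium-squad2-distilled was trained with Haystack's "
        "distillation feature, using "
        "deepset/bert-large-uncased-whole-word-masking-squad2 as the teacher."
    ),
)
print(result["answer"], result["score"])
```

The pipeline returns a dict with `answer`, `score`, `start`, and `end` keys; for SQuAD 2.0-style models, a low score can indicate the question is unanswerable from the given context.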

bert-medium-squad2-distilled huggingface.co Url

https://huggingface.co/deepset/bert-medium-squad2-distilled

deepset bert-medium-squad2-distilled online free

huggingface.co provides an online trial and API platform for bert-medium-squad2-distilled, integrating the model together with API services. You can try bert-medium-squad2-distilled online for free by clicking the link below.

deepset bert-medium-squad2-distilled online free url in huggingface.co:

https://huggingface.co/deepset/bert-medium-squad2-distilled

bert-medium-squad2-distilled install

bert-medium-squad2-distilled is an open-source model, and any user can download and install it for free. At the same time, huggingface.co hosts the model, so users can debug and trial it there directly without a local install; free API access is also available.

bert-medium-squad2-distilled install url in huggingface.co:

https://huggingface.co/deepset/bert-medium-squad2-distilled


Provider of bert-medium-squad2-distilled huggingface.co

deepset
