deepset / gbert-base-germandpr-ctx_encoder

huggingface.co
Total runs: 51
24-hour runs: 0
7-day runs: 17
30-day runs: 34
Last updated: September 26, 2024


Overview

Language model: gbert-base-germandpr
Language: German
Training data: GermanDPR train set (~56 MB)
Eval data: GermanDPR test set (~6 MB)
Infrastructure: 4x V100 GPU
Published: April 26, 2021

Details
  • We trained a dense passage retrieval (DPR) model with two gbert-base models as encoders for questions and passages.
  • The dataset is GermanDPR, a new German-language dataset that we hand-annotated and published online.
  • It comprises 9,275 question/answer pairs in the training set and 1,025 pairs in the test set. Each pair comes with one positive context and three hard negative contexts.
  • As the basis of the training data, we used our hand-annotated GermanQuAD dataset for the positive samples and generated hard negative samples from the latest German Wikipedia dump (6 GB of raw text files).
  • The data dump was cleaned with tailored scripts, yielding 2.8 million indexed passages from German Wikipedia.

See https://deepset.ai/germanquad for more details and dataset download.
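For illustration, here is a hypothetical record in the DPR-style JSON layout described above, with one positive and three hard negative contexts per question. The field names follow the open-source DPR convention and the titles and texts are invented:

```python
# Hypothetical GermanDPR-style record (invented content, DPR-style field names).
record = {
    "question": "Wie viele Einwohner hat Berlin?",
    "answers": ["rund 3,7 Millionen"],
    "positive_ctxs": [
        {"title": "Berlin", "text": "Berlin hat rund 3,7 Millionen Einwohner ..."},
    ],
    "hard_negative_ctxs": [
        # Lexically similar but non-answering passages mined from Wikipedia.
        {"title": "Hamburg", "text": "Hamburg ist die zweitgrößte Stadt Deutschlands ..."},
        {"title": "Brandenburg", "text": "Brandenburg umschließt die Hauptstadt Berlin ..."},
        {"title": "Einwohnerzahl", "text": "Die Einwohnerzahl bezeichnet die Anzahl ..."},
    ],
}
```

Hard negatives of this shape are what make the contrastive training signal stronger than random in-batch negatives alone.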

Hyperparameters
batch_size = 40
n_epochs = 20
num_training_steps = 4640
num_warmup_steps = 460
max_seq_len = 32 tokens for question encoder and 300 tokens for passage encoder
learning_rate = 1e-6
lr_schedule = LinearWarmup
embeds_dropout_prob = 0.1
num_hard_negatives = 2
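As a sanity check, the step count above follows directly from the train-set size, batch size, and epoch count:

```python
import math

train_pairs = 9275   # GermanDPR train set size
batch_size = 40
n_epochs = 20

steps_per_epoch = math.ceil(train_pairs / batch_size)  # 232
num_training_steps = steps_per_epoch * n_epochs        # 4640, matching the value above
```

The warmup of 460 steps is roughly 10% of the total training steps.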

Performance

During training, we monitored the in-batch average rank and the loss, and evaluated different batch sizes, numbers of epochs, and numbers of hard negatives on a dev set split from the train set. The dev split contained 1,030 question/answer pairs. Even without thorough hyperparameter tuning, we observed stable learning, and multiple restarts with different seeds produced very similar results. Note that the in-batch average rank is influenced by the batch size and the number of hard negatives: a smaller number of hard negatives makes the task easier. After fixing the hyperparameters, we trained the model on the full GermanDPR train set.
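The in-batch average rank metric can be sketched as follows (a minimal illustration, assuming the usual DPR convention that passage i is the positive for question i and that rank 0 means the positive scored highest in the batch):

```python
def in_batch_average_rank(sim):
    """sim[i][j]: similarity of question i to passage j within a batch.
    Passage i is the positive for question i; rank 0 = positive scored highest."""
    ranks = [sum(1 for s in row if s > row[i]) for i, row in enumerate(sim)]
    return sum(ranks) / len(ranks)

# Perfect batch: every positive (the diagonal) outscores all other passages.
perfect = [[0.9, 0.1, 0.2],
           [0.0, 0.8, 0.3],
           [0.1, 0.2, 0.7]]
print(in_batch_average_rank(perfect))  # 0.0
```

With a larger batch or more hard negatives there are more competing passages per question, which is why the metric is only comparable across runs with identical settings.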

We further evaluated the retrieval performance of the trained model on the full German Wikipedia, using the GermanDPR test set as labels. To this end, we converted the GermanDPR test set to SQuAD format. The DPR model drastically outperforms the BM25 baseline with respect to recall@k.
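A minimal sketch of recall@k, under the assumption that a query counts as a hit if any of its top-k retrieved passages is a gold positive (the data below is invented):

```python
def recall_at_k(retrieved, gold, k):
    """retrieved: query -> ranked list of passage ids (best first);
    gold: query -> set of gold positive passage ids."""
    hits = sum(1 for q in gold if any(p in gold[q] for p in retrieved[q][:k]))
    return hits / len(gold)

retrieved = {"q1": ["p3", "p1", "p9"], "q2": ["p5", "p2", "p7"]}
gold = {"q1": {"p1"}, "q2": {"p8"}}
print(recall_at_k(retrieved, gold, 2))  # 0.5: q1 hit at rank 2, q2 missed
```

Over 2.8 million indexed passages, even modest gains in recall@k translate into many more answerable questions downstream.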

Usage
In Haystack

You can load the model in Haystack as a retriever for QA at scale:

# Requires an existing document_store; import path depends on your Haystack version.
retriever = DensePassageRetriever(
  document_store=document_store,
  query_embedding_model="deepset/gbert-base-germandpr-question_encoder",
  passage_embedding_model="deepset/gbert-base-germandpr-ctx_encoder",
)
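Independently of Haystack, the retrieval step itself is just a dot product between the question embedding and each passage embedding, with passages ranked by score. A toy sketch with made-up three-dimensional vectors in place of real encoder outputs:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def rank_passages(question_emb, passage_embs, top_k=2):
    # Return passage indices sorted by dot-product similarity, best first.
    order = sorted(range(len(passage_embs)),
                   key=lambda i: dot(question_emb, passage_embs[i]),
                   reverse=True)
    return order[:top_k]

q = [1.0, 0.0, 1.0]
passages = [[0.1, 0.9, 0.0],   # unrelated
            [0.9, 0.1, 0.8],   # best match
            [0.5, 0.2, 0.4]]   # partial match
print(rank_passages(q, passages))  # [1, 2]
```

In production this scoring runs over precomputed passage embeddings in a vector index rather than a Python loop.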
Authors
  • Timo Möller: timo.moeller [at] deepset.ai
  • Julian Risch: julian.risch [at] deepset.ai
  • Malte Pietsch: malte.pietsch [at] deepset.ai
About us

We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.

Get in touch: Twitter | LinkedIn | Website

By the way: we're hiring!

License: MIT (https://choosealicense.com/licenses/mit)

Model page: https://huggingface.co/deepset/gbert-base-germandpr-ctx_encoder