Language model:
deepset/roberta-base-squad2-distilled
Language:
English
Training data:
SQuAD 2.0 training set
Eval data:
SQuAD 2.0 dev set
Infrastructure:
1x V100 GPU
Published:
Apr 21st, 2021
Details:
Haystack's model distillation feature was used for training. deepset/bert-large-uncased-whole-word-masking-squad2 was used as the teacher model.
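As a rough illustration, a distillation run along these lines can be sketched with Haystack 1.x's FARMReader.distil_from. The local data paths and output directory below are assumptions, and hyperparameters are left at their defaults rather than matching the exact training configuration:

```python
from haystack.nodes import FARMReader

# Student: the base checkpoint to be distilled (roberta-base, per the model name).
student = FARMReader(model_name_or_path="roberta-base")

# Teacher: the larger SQuAD 2.0 model named above.
teacher = FARMReader(
    model_name_or_path="deepset/bert-large-uncased-whole-word-masking-squad2"
)

# Distil the teacher into the student on the SQuAD 2.0 training set.
# data_dir and train_filename point at a local copy of SQuAD 2.0 (assumed paths).
student.distil_from(
    teacher,
    data_dir="data/squad2",
    train_filename="train-v2.0.json",
)

# Save the distilled student model (output directory is an assumption).
student.save(directory="roberta-base-squad2-distilled")
```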
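For inference, the distilled model can be loaded with the standard transformers question-answering pipeline; the question and context below are illustrative only:

```python
from transformers import pipeline

# Load the distilled model for extractive question answering.
qa = pipeline(
    "question-answering",
    model="deepset/roberta-base-squad2-distilled",
    tokenizer="deepset/roberta-base-squad2-distilled",
)

# Illustrative example: the answer span is extracted from the given context.
result = qa(
    question="What was used as the teacher model?",
    context=(
        "deepset/bert-large-uncased-whole-word-masking-squad2 "
        "was used as the teacher model."
    ),
)
print(result)  # {'score': ..., 'start': ..., 'end': ..., 'answer': ...}
```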