Please use `BertForQuestionAnswering` to load this model!
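A minimal loading sketch with the 🤗 Transformers library; the question/context pair below is only an illustrative example and not taken from this card:

```python
from transformers import BertForQuestionAnswering, BertTokenizerFast, pipeline

model_name = "hfl/chinese-pert-large-mrc"

# Load the tokenizer and the fine-tuned MRC model
tokenizer = BertTokenizerFast.from_pretrained(model_name)
model = BertForQuestionAnswering.from_pretrained(model_name)

# Wrap in an extractive question-answering pipeline
qa = pipeline("question-answering", model=model, tokenizer=tokenizer)

# Illustrative question/context pair (not from the original card)
result = qa(
    question="哈利·波特是谁创作的？",
    context="哈利·波特是英国作家J.K.罗琳创作的系列小说。",
)
print(result["answer"], result["score"])
```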
This is a Chinese machine reading comprehension (MRC) model built on PERT-large and fine-tuned on a mixture of Chinese MRC datasets.
PERT is a pre-trained model based on the permuted language model (PerLM) objective, which learns text semantics in a self-supervised manner without introducing mask tokens ([MASK]). It yields competitive results in tasks such as reading comprehension and sequence labeling.
Results on Chinese MRC datasets (EM/F1):
(We report the checkpoint with the best average (AVG) score.)