dmis-lab / TinySapBERT-from-TinyPubMedBERT-v1.0

huggingface.co
Model last updated: March 17, 2023
Pipeline tag: feature-extraction

Introduction to TinySapBERT-from-TinyPubMedBERT-v1.0

Model Details of TinySapBERT-from-TinyPubMedBERT-v1.0

This model repository presents "TinySapBERT", a tiny-sized biomedical entity representation (language) model trained using the official SapBERT code and instructions (Liu et al., NAACL 2021). We used our TinyPubMedBERT, a tiny-sized LM, as the starting point and trained it with the SapBERT scheme.
cf) TinyPubMedBERT is a distilled version of PubMedBERT (Gu et al., 2021), open-sourced along with the release of the KAZU (Korea University and AstraZeneca) framework.

  • For details, please visit the KAZU framework or see our paper entitled Biomedical NER for the Enterprise with Distillated BERN2 and the Kazu Framework (EMNLP 2022 industry track).
  • For a demo of the KAZU framework, please visit http://kazu.korea.ac.kr
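
For reference, below is a minimal usage sketch (not an official script from the authors) showing how the model could be loaded for feature extraction with the Hugging Face transformers library. The [CLS]-token pooling is an assumption that follows the common SapBERT convention; please check the official SapBERT code for the exact pooling used in their experiments.

# Minimal sketch (assumption: [CLS]-token pooling, as commonly used with SapBERT).
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "dmis-lab/TinySapBERT-from-TinyPubMedBERT-v1.0"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

# A few example biomedical entity mentions to embed.
entity_mentions = ["covid-19", "coronavirus infection", "high fever"]
inputs = tokenizer(entity_mentions, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One vector per mention: the [CLS] (first) token of the last hidden layer.
embeddings = outputs.last_hidden_state[:, 0, :]
print(embeddings.shape)  # (3, hidden_size)

The resulting vectors can then be compared with cosine similarity or nearest-neighbour search, e.g. for biomedical entity linking.
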
Citation info

The paper is joint first-authored by Richard Jackson (AstraZeneca) and WonJin Yoon (Korea University).
Please cite using the simplified BibTeX entry below, or find the full citation information here.

@inproceedings{YoonAndJackson2022BiomedicalNER,
  title     = "Biomedical {NER} for the Enterprise with Distillated {BERN}2 and the Kazu Framework",
  author    = "Yoon, Wonjin and Jackson, Richard and Ford, Elliot and Poroshin, Vladimir and Kang, Jaewoo",
  booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: Industry Track",
  month     = dec,
  year      = "2022",
  address   = "Abu Dhabi, UAE",
  publisher = "Association for Computational Linguistics",
  url       = "https://aclanthology.org/2022.emnlp-industry.63",
  pages     = "619--626",
}

This model was built using resources from the SapBERT paper. We thank the authors for making these resources publicly available!

Liu, Fangyu, et al. "Self-Alignment Pretraining for Biomedical Entity Representations." 
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2021.
Contact Information

For help or issues using the code or model (the NER module of KAZU) in this repository, please contact WonJin Yoon (wonjin.info (at) gmail.com) or submit a GitHub issue.

Runs of dmis-lab TinySapBERT-from-TinyPubMedBERT-v1.0 on huggingface.co

Total runs: 3.0K
24-hour runs: 20
3-day runs: 592
7-day runs: 1.3K
30-day runs: -41.0K

More Information About the TinySapBERT-from-TinyPubMedBERT-v1.0 Model on huggingface.co

TinySapBERT-from-TinyPubMedBERT-v1.0 huggingface.co

TinySapBERT-from-TinyPubMedBERT-v1.0 is an AI model hosted on huggingface.co that can be used instantly as the dmis-lab TinySapBERT-from-TinyPubMedBERT-v1.0 model. huggingface.co supports a free trial of the TinySapBERT-from-TinyPubMedBERT-v1.0 model and also provides paid use of it. The model can be called through an API from Node.js, Python, or plain HTTP.
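
As an illustration of the API access described above, here is a hedged Python sketch that uses the generic Hugging Face Inference API endpoint pattern. Whether this particular model is currently served there, and the exact shape of the response, should be verified on huggingface.co; the HF_TOKEN environment variable is a placeholder for a personal access token.

# Hedged sketch: calling the model over HTTP via the Hugging Face Inference API.
# Assumptions: the model is served at the generic api-inference endpoint and
# HF_TOKEN holds a valid access token.
import os
import requests

API_URL = "https://api-inference.huggingface.co/models/dmis-lab/TinySapBERT-from-TinyPubMedBERT-v1.0"
headers = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}

response = requests.post(API_URL, headers=headers, json={"inputs": "covid-19"})
response.raise_for_status()
features = response.json()  # for feature-extraction models, typically a nested list of feature vectors
print(len(features))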

TinySapBERT-from-TinyPubMedBERT-v1.0 huggingface.co URL

https://huggingface.co/dmis-lab/TinySapBERT-from-TinyPubMedBERT-v1.0

dmis-lab TinySapBERT-from-TinyPubMedBERT-v1.0 online free

huggingface.co is an online trial and API platform that integrates TinySapBERT-from-TinyPubMedBERT-v1.0, including API services, and provides a free online trial of the model. You can try TinySapBERT-from-TinyPubMedBERT-v1.0 online for free by clicking the link below.

dmis-lab TinySapBERT-from-TinyPubMedBERT-v1.0 online free URL on huggingface.co:

https://huggingface.co/dmis-lab/TinySapBERT-from-TinyPubMedBERT-v1.0

TinySapBERT-from-TinyPubMedBERT-v1.0 install

TinySapBERT-from-TinyPubMedBERT-v1.0 is an open-source model whose code is available on GitHub, and any user can install it for free. At the same time, huggingface.co hosts the installed model, so users can try and debug TinySapBERT-from-TinyPubMedBERT-v1.0 directly on huggingface.co. Free installation through the API is also supported.
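
One possible way to install the model locally is sketched below, assuming the huggingface_hub and transformers packages are installed; this is not an official install script, just a download of the repository files followed by loading them from disk.

# Sketch: download the model files locally and load them with transformers.
# Assumes: pip install huggingface_hub transformers torch
from huggingface_hub import snapshot_download
from transformers import AutoModel, AutoTokenizer

local_dir = snapshot_download("dmis-lab/TinySapBERT-from-TinyPubMedBERT-v1.0")

# The local directory can be used exactly like the hub model id.
tokenizer = AutoTokenizer.from_pretrained(local_dir)
model = AutoModel.from_pretrained(local_dir)
print(local_dir)  # path to the cached model files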

TinySapBERT-from-TinyPubMedBERT-v1.0 install URL on huggingface.co:

https://huggingface.co/dmis-lab/TinySapBERT-from-TinyPubMedBERT-v1.0

URL of TinySapBERT-from-TinyPubMedBERT-v1.0

TinySapBERT-from-TinyPubMedBERT-v1.0 huggingface.co URL

Provider of TinySapBERT-from-TinyPubMedBERT-v1.0 huggingface.co

dmis-lab (organization)
