microsoft / biogpt

huggingface.co
Total runs: 153.3K
24-hour run growth: -4.3K
3-day run growth: -7.5K
7-day run growth: -2.9K
30-day run growth: -38.0K
Last updated: February 03, 2023
Task: text-generation

Introduction

Model Details

BioGPT

Pre-trained language models have attracted increasing attention in the biomedical domain, inspired by their great success in the general natural language domain. Of the two main branches of pre-trained language models in the general domain, i.e. BERT (and its variants) and GPT (and its variants), the first has been extensively studied in the biomedical domain, for example BioBERT and PubMedBERT. While these models have achieved great success on a variety of discriminative downstream biomedical tasks, their lack of generation ability constrains their application scope. In this paper, we propose BioGPT, a domain-specific generative Transformer language model pre-trained on large-scale biomedical literature. We evaluate BioGPT on six biomedical natural language processing tasks and demonstrate that our model outperforms previous models on most of them. In particular, BioGPT achieves F1 scores of 44.98%, 38.42% and 40.76% on the BC5CDR, KD-DTI and DDI end-to-end relation extraction tasks, respectively, and 78.2% accuracy on PubMedQA, setting a new record. Our case study on text generation further demonstrates BioGPT's advantage in generating fluent descriptions for biomedical terms.

You can use this model directly with a pipeline for text generation. Since the generation relies on some randomness, we set a seed for reproducibility:

>>> from transformers import pipeline, set_seed
>>> from transformers import BioGptTokenizer, BioGptForCausalLM
>>> model = BioGptForCausalLM.from_pretrained("microsoft/biogpt")
>>> tokenizer = BioGptTokenizer.from_pretrained("microsoft/biogpt")
>>> generator = pipeline('text-generation', model=model, tokenizer=tokenizer)
>>> set_seed(42)
>>> generator("COVID-19 is", max_length=20, num_return_sequences=5, do_sample=True)
[{'generated_text': 'COVID-19 is a disease that spreads worldwide and is currently found in a growing proportion of the population'},
 {'generated_text': 'COVID-19 is one of the largest viral epidemics in the world.'},
 {'generated_text': 'COVID-19 is a common condition affecting an estimated 1.1 million people in the United States alone.'},
 {'generated_text': 'COVID-19 is a pandemic, the incidence has been increased in a manner similar to that in other'},
 {'generated_text': 'COVID-19 is transmitted via droplets, air-borne, or airborne transmission.'}]

Here is how to use this model to get the features of a given text in PyTorch:

from transformers import BioGptTokenizer, BioGptForCausalLM

tokenizer = BioGptTokenizer.from_pretrained("microsoft/biogpt")
model = BioGptForCausalLM.from_pretrained("microsoft/biogpt")

text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')  # tokenize to PyTorch tensors
output = model(**encoded_input)                       # forward pass over the encoded input
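
The forward pass above returns language-modeling logits. If what you want is contextual token embeddings as features, one option (a minimal sketch; output_hidden_states is the standard transformers mechanism, and the shape comment is a general description rather than something stated in this card) is to request the hidden states explicitly:

output = model(**encoded_input, output_hidden_states=True)
features = output.hidden_states[-1]  # final-layer token representations, shape (batch, seq_len, hidden_size)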

Beam-search decoding:

import torch
from transformers import BioGptTokenizer, BioGptForCausalLM, set_seed

tokenizer = BioGptTokenizer.from_pretrained("microsoft/biogpt")
model = BioGptForCausalLM.from_pretrained("microsoft/biogpt")

sentence = "COVID-19 is"
inputs = tokenizer(sentence, return_tensors="pt")

set_seed(42)

with torch.no_grad():
    beam_output = model.generate(
        **inputs,
        min_length=100,
        max_length=1024,
        num_beams=5,
        early_stopping=True,
    )
tokenizer.decode(beam_output[0], skip_special_tokens=True)
'COVID-19 is a global pandemic caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), the causative agent of coronavirus disease 2019 (COVID-19), which has spread to more than 200 countries and territories, including the United States (US), Canada, Australia, New Zealand, the United Kingdom (UK), and the United States of America (USA), as of March 11, 2020, with more than 800,000 confirmed cases and more than 800,000 deaths.'
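
For comparison, here is a sampling-based call to model.generate, reusing the objects above (a minimal sketch; the top_k and top_p values are illustrative defaults, not settings from the BioGPT authors):

set_seed(42)

with torch.no_grad():
    sample_output = model.generate(
        **inputs,
        do_sample=True,   # sample tokens instead of beam search
        top_k=50,         # illustrative: restrict sampling to the 50 most likely tokens
        top_p=0.95,       # illustrative: nucleus sampling threshold
        max_length=100,
    )
tokenizer.decode(sample_output[0], skip_special_tokens=True)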
Citation

If you find BioGPT useful in your research, please cite the following paper:

@article{10.1093/bib/bbac409,
    author = {Luo, Renqian and Sun, Liai and Xia, Yingce and Qin, Tao and Zhang, Sheng and Poon, Hoifung and Liu, Tie-Yan},
    title = "{BioGPT: generative pre-trained transformer for biomedical text generation and mining}",
    journal = {Briefings in Bioinformatics},
    volume = {23},
    number = {6},
    year = {2022},
    month = {09},
    abstract = "{Pre-trained language models have attracted increasing attention in the biomedical domain, inspired by their great success in the general natural language domain. Among the two main branches of pre-trained language models in the general language domain, i.e. BERT (and its variants) and GPT (and its variants), the first one has been extensively studied in the biomedical domain, such as BioBERT and PubMedBERT. While they have achieved great success on a variety of discriminative downstream biomedical tasks, the lack of generation ability constrains their application scope. In this paper, we propose BioGPT, a domain-specific generative Transformer language model pre-trained on large-scale biomedical literature. We evaluate BioGPT on six biomedical natural language processing tasks and demonstrate that our model outperforms previous models on most tasks. Especially, we get 44.98\%, 38.42\% and 40.76\% F1 score on BC5CDR, KD-DTI and DDI end-to-end relation extraction tasks, respectively, and 78.2\% accuracy on PubMedQA, creating a new record. Our case study on text generation further demonstrates the advantage of BioGPT on biomedical literature to generate fluent descriptions for biomedical terms.}",
    issn = {1477-4054},
    doi = {10.1093/bib/bbac409},
    url = {https://doi.org/10.1093/bib/bbac409},
    note = {bbac409},
    eprint = {https://academic.oup.com/bib/article-pdf/23/6/bbac409/47144271/bbac409.pdf},
}


More Information About the biogpt Model on huggingface.co

biogpt is released under the MIT license. For license details, visit:

https://choosealicense.com/licenses/mit

biogpt on huggingface.co

biogpt is an AI model hosted on huggingface.co, where its outputs can be tried instantly with the microsoft/biogpt checkpoint. huggingface.co supports a free trial of biogpt and also offers paid usage, and the model can be called through an API from Node.js, Python, or plain HTTP.
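
As one way to make such an API call, here is a minimal Python sketch (assuming a Hugging Face access token in the HF_TOKEN environment variable; the endpoint and payload shape follow the standard Hugging Face Inference API conventions and should be checked against current documentation):

import os
import requests

API_URL = "https://api-inference.huggingface.co/models/microsoft/biogpt"
headers = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}  # assumes HF_TOKEN is set

# The Inference API takes the prompt under the "inputs" key.
response = requests.post(API_URL, headers=headers, json={"inputs": "COVID-19 is"})
print(response.json())  # on success: a list of {"generated_text": ...} dicts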

Try microsoft biogpt online for free

huggingface.co is an online platform for model trials and API calls. It integrates biogpt's capabilities, including API services, and offers a free online trial: you can try biogpt for free at the link below.

Free online trial of microsoft/biogpt on huggingface.co:

https://huggingface.co/microsoft/biogpt

biogpt install

biogpt is an open-source model whose code is published on GitHub, and any user can install it from there for free. At the same time, huggingface.co hosts the model directly, so users can debug and trial biogpt on huggingface.co without installing anything locally, and the API can likewise be used for free. When installing locally, note that the BioGPT tokenizer additionally depends on the sacremoses package (pip install transformers sacremoses).

biogpt page on huggingface.co:

https://huggingface.co/microsoft/biogpt

Url of biogpt

https://huggingface.co/microsoft/biogpt

Provider of biogpt

microsoft (organization)
