This is a sentence-transformers model: it maps sentences and paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search. It was trained on the SNLI, MNLI, SCINLI, SCITAIL, MEDNLI and STSB datasets to provide robust sentence embeddings.
Usage (Sentence-Transformers)
With sentence-transformers installed (pip install -U sentence-transformers), you can use the model like this:
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('pritamdeka/BioBERT-mnli-snli-scinli-scitail-mednli-stsb')
embeddings = model.encode(sentences)
print(embeddings)
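The resulting embeddings are typically compared with cosine similarity for semantic search or clustering. As a minimal sketch of that step, here is cosine similarity in plain NumPy on toy vectors (the 4-dimensional vectors are illustrative placeholders, not real outputs of the 768-dimensional model):

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two 1-D vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy stand-ins for the 768-dimensional model embeddings
query = np.array([0.1, 0.3, -0.2, 0.8])
doc_a = np.array([0.1, 0.25, -0.15, 0.7])  # similar direction -> high score
doc_b = np.array([-0.5, 0.1, 0.9, -0.3])   # different direction -> low score

print(cosine_sim(query, doc_a))
print(cosine_sim(query, doc_b))
```

In a real semantic-search setup you would encode the query and all candidate documents with `model.encode(...)` and rank the documents by this score.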
Usage (HuggingFace Transformers)
Without sentence-transformers, you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
from transformers import AutoTokenizer, AutoModel
import torch
# Mean Pooling - take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # first element of model_output contains all token embeddings
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('pritamdeka/BioBERT-mnli-snli-scinli-scitail-mednli-stsb')
model = AutoModel.from_pretrained('pritamdeka/BioBERT-mnli-snli-scinli-scitail-mednli-stsb')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
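To see concretely what the masked mean pooling above computes, here is a NumPy mirror of it on a tiny toy batch (the numbers are made up; real token embeddings are 768-dimensional). Padded positions are zeroed out by the attention mask, so each sentence embedding is the average of its real tokens only:

```python
import numpy as np

def mean_pooling_np(token_embeddings, attention_mask):
    """NumPy equivalent of the torch mean_pooling function above."""
    mask = attention_mask[..., np.newaxis].astype(float)  # (batch, seq, 1)
    summed = (token_embeddings * mask).sum(axis=1)        # sum over real tokens only
    counts = np.clip(mask.sum(axis=1), 1e-9, None)        # number of real tokens
    return summed / counts

# One sentence with 2 real tokens and 1 padding token
tokens = np.array([[[1.0, 3.0],    # token 1
                    [3.0, 5.0],    # token 2
                    [9.0, 9.0]]])  # padding - must be ignored
mask = np.array([[1, 1, 0]])

print(mean_pooling_np(tokens, mask))  # [[2. 4.]] - mean of the two real tokens
```

The `clip` (like `torch.clamp` above) only guards against division by zero for an all-padding row; it does not change the result for normal inputs.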
Evaluation Results
For an automated evaluation of this model, see the Sentence Embeddings Benchmark: https://seb.sbert.net
Training
The model was trained with the parameters:
DataLoader: torch.utils.data.dataloader.DataLoader of length 90 with parameters:
Citing & Authors
If you use the model, kindly cite the following work:
@inproceedings{deka2022evidence,
title={Evidence Extraction to Validate Medical Claims in Fake News Detection},
author={Deka, Pritam and Jurek-Loughrey, Anna and others},
booktitle={International Conference on Health Information Science},
pages={3--15},
year={2022},
organization={Springer}
}
Runs of pritamdeka/BioBERT-mnli-snli-scinli-scitail-mednli-stsb on huggingface.co
Total runs: 141.0K
24-hour runs: 0
3-day runs: 47.2K
7-day runs: 51.7K
30-day runs: 80.0K