This model was trained on the Quora Duplicate Questions dataset. It predicts a score between 0 and 1 indicating how likely it is that the two given questions are duplicates.
Note: The model is not suitable for estimating the semantic similarity of questions. For example, the two questions "How to learn Java" and "How to learn Python" will receive a rather low score, as they are not duplicates.
Usage and Performance
Pre-trained models can be used like this:

```python
from sentence_transformers import CrossEncoder

# Load the pre-trained cross-encoder
model = CrossEncoder('cross-encoder/quora-roberta-base')

# Predict duplicate scores for pairs of questions
scores = model.predict([('Question 1', 'Question 2'), ('Question 3', 'Question 4')])
```
You can also use this model without sentence_transformers, using only the Transformers AutoModel class.
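A minimal sketch of that route, assuming the standard `AutoTokenizer`/`AutoModelForSequenceClassification` API; the `score_pair` and `sigmoid` helpers are illustrative names, not part of the library, and the sigmoid maps the model's single raw logit to the 0-1 range described above:

```python
import math


def sigmoid(logit: float) -> float:
    """Map a raw model logit to a 0-1 duplicate probability."""
    return 1.0 / (1.0 + math.exp(-logit))


def score_pair(question_a: str, question_b: str) -> float:
    """Illustrative helper: score one question pair with plain Transformers.

    Loads the model on every call for clarity; in real code, load the
    tokenizer and model once and reuse them.
    """
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    name = 'cross-encoder/quora-roberta-base'
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForSequenceClassification.from_pretrained(name)
    model.eval()

    # The tokenizer encodes the two questions as one paired input
    features = tokenizer(question_a, question_b, return_tensors='pt', truncation=True)
    with torch.no_grad():
        logit = model(**features).logits[0, 0].item()
    return sigmoid(logit)
```

Duplicate pairs should yield scores near 1, unrelated pairs near 0.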