The GTE models are trained by Alibaba DAMO Academy. They are mainly based on the BERT framework and currently come in three sizes: GTE-large, GTE-base, and GTE-small. The GTE models are trained on a large-scale corpus of relevant text pairs covering a wide range of domains and scenarios, which enables them to be applied to various downstream text embedding tasks, including information retrieval, semantic textual similarity, and text reranking.
Metrics
The performance of the GTE models was compared with other popular text embedding models on the MTEB benchmark. For more detailed comparison results, please refer to the MTEB leaderboard.
# Requires: pip install sentence-transformers
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim

sentences = ['That is a happy person', 'That is a very happy person']
# Load the gte-small model from the Hugging Face Hub
model = SentenceTransformer('Supabase/gte-small')
# Encode both sentences into 384-dimensional embeddings
embeddings = model.encode(sentences)
# Cosine similarity between the two embeddings (close to 1 for similar sentences)
print(cos_sim(embeddings[0], embeddings[1]))
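The cos_sim utility above computes cosine similarity between embedding vectors, which is also the basis for retrieval with these models: rank corpus embeddings by similarity to a query embedding. As a minimal sketch of that ranking step using NumPy, with toy low-dimensional vectors standing in for real 384-dimensional gte-small embeddings (the names and vectors here are illustrative, not from the model):

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity: dot product of the L2-normalized vectors
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    return float(np.dot(a, b))

# Toy stand-ins for embeddings; real ones come from model.encode(...)
query = np.array([0.2, 0.9, 0.1])
corpus = {
    "doc_a": np.array([0.1, 0.8, 0.2]),
    "doc_b": np.array([0.9, 0.1, 0.0]),
}

# Rank corpus entries by similarity to the query, highest first
ranked = sorted(corpus.items(), key=lambda kv: cosine_sim(query, kv[1]), reverse=True)
print([name for name, _ in ranked])  # doc_a ranks above doc_b
```

With real embeddings, the same ranking loop turns the model into a simple semantic search engine over a document collection.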