cointegrated / rubert-tiny

huggingface.co
Total runs: 7.4K
24-hour runs: 0
3-day runs: 207
7-day runs: 275
30-day runs: -31.5K
Model last updated: February 10, 2024
fill-mask

Introduction to rubert-tiny

Model Details of rubert-tiny

This is a very small distilled version of the bert-base-multilingual-cased model for Russian and English (45 MB, 12M parameters). There is also an updated version of this model, rubert-tiny2, with a larger vocabulary and better quality on practically all Russian NLU tasks.

This model is useful if you want to fine-tune it for a relatively simple Russian task (e.g. NER or sentiment classification) and you care more about speed and size than about accuracy. It is approximately 10x smaller and faster than a base-sized BERT. Its [CLS] embeddings can be used as a sentence representation aligned between Russian and English.
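One lightweight way to build a classifier on top of this model is to train a small head over its 312-dimensional [CLS] embeddings (the hidden size matches the (312,) embedding shape shown below). The sketch below uses torch only, with random tensors standing in for real embeddings and labels; the label set and hyperparameters are hypothetical placeholders, not the recommended fine-tuning recipe.

```python
import torch

# rubert-tiny's hidden size is 312, so a sentence-classification head
# can be a single linear layer over the [CLS] embedding.
HIDDEN = 312
NUM_CLASSES = 2  # hypothetical, e.g. positive / negative sentiment

head = torch.nn.Linear(HIDDEN, NUM_CLASSES)
optimizer = torch.optim.AdamW(head.parameters(), lr=1e-3)
loss_fn = torch.nn.CrossEntropyLoss()

# Stand-ins for [CLS] embeddings and their labels; in practice these
# would come from the model and your labeled dataset.
embeddings = torch.randn(8, HIDDEN)
labels = torch.randint(0, NUM_CLASSES, (8,))

logits = head(embeddings)     # shape (8, 2)
loss = loss_fn(logits, labels)
loss.backward()
optimizer.step()
```

For best accuracy you would typically fine-tune the whole transformer rather than a frozen-embedding head, but the frozen variant keeps training cheap and fast, which matches this model's purpose.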

It was trained on the Yandex Translate corpus, OPUS-100, and Tatoeba, using MLM loss (distilled from bert-base-multilingual-cased), a translation ranking loss, and [CLS] embeddings distilled from LaBSE, rubert-base-cased-sentence, LASER, and USE.

A more detailed description is available in Russian.
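The translation ranking loss mentioned above is typically an in-batch contrastive objective: each Russian sentence should be more similar to its own English translation than to any other translation in the batch. A generic sketch (not the exact training code; the temperature value is an assumption):

```python
import torch
import torch.nn.functional as F

def translation_ranking_loss(ru_emb, en_emb, temperature=0.05):
    """In-batch translation ranking loss: row i of ru_emb and row i of
    en_emb are a translation pair, so the similarity matrix should be
    largest on its diagonal."""
    ru = F.normalize(ru_emb, dim=1)
    en = F.normalize(en_emb, dim=1)
    sim = ru @ en.T / temperature          # pairwise cosine similarities
    targets = torch.arange(sim.size(0))    # matching pairs on the diagonal
    return F.cross_entropy(sim, targets)

# Toy batch: perfectly aligned pairs give a much lower loss than
# mismatched ones.
emb = torch.randn(4, 312)
aligned = translation_ranking_loss(emb, emb)
shuffled = translation_ranking_loss(emb, emb[[1, 2, 3, 0]])
```

Pulling the paired sentences together while pushing apart the rest of the batch is what aligns the Russian and English [CLS] embeddings in a shared space.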

Sentence embeddings can be produced as follows:

# pip install transformers sentencepiece
import torch
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("cointegrated/rubert-tiny")
model = AutoModel.from_pretrained("cointegrated/rubert-tiny")
# model.cuda()  # uncomment it if you have a GPU

def embed_bert_cls(text, model, tokenizer):
    t = tokenizer(text, padding=True, truncation=True, return_tensors='pt')
    with torch.no_grad():
        model_output = model(**{k: v.to(model.device) for k, v in t.items()})
    # Take the [CLS] token embedding and L2-normalize it
    embeddings = model_output.last_hidden_state[:, 0, :]
    embeddings = torch.nn.functional.normalize(embeddings)
    return embeddings[0].cpu().numpy()

print(embed_bert_cls('привет мир', model, tokenizer).shape)
# (312,)
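Because embed_bert_cls L2-normalizes its output, cross-lingual similarity between two sentences reduces to a dot product of their embeddings. A minimal sketch with random stand-in vectors; in practice you would pass the outputs of embed_bert_cls for, say, a Russian sentence and its English translation:

```python
import numpy as np

def cls_similarity(vec_a, vec_b):
    """Cosine similarity between two embeddings from embed_bert_cls.
    The vectors are already L2-normalized, so this is just a dot product."""
    return float(np.dot(vec_a, vec_b))

# Random stand-ins for real 312-dim sentence embeddings.
rng = np.random.default_rng(0)
a = rng.normal(size=312); a /= np.linalg.norm(a)
b = rng.normal(size=312); b /= np.linalg.norm(b)
print(cls_similarity(a, a))  # ≈ 1.0 for identical sentences
```

Translation pairs should score well above unrelated sentence pairs, since the [CLS] space was distilled to be aligned between the two languages.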


More information about the rubert-tiny model

License: MIT (https://choosealicense.com/licenses/mit)

Model page: https://huggingface.co/cointegrated/rubert-tiny


Provider of rubert-tiny on huggingface.co: cointegrated

Other API from cointegrated