ai-forever / sage-mt5-large

Last updated: April 4, 2024
text2text-generation

sage-mt5-large

Summary

The model corrects spelling errors and typos in both Russian and English by normalizing all words in the text to the standard form of the language. The corrector was trained on top of the mT5-large architecture. An extensive dataset with "artificial" errors served as the training corpus: it was assembled from Russian-language Wikipedia and transcripts of Russian-language videos, and typos and spelling errors were then introduced into it automatically with the SAGE library.
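
To give a feel for what the "artificial" errors look like, here is a small sketch that injects random character-level typos (drops, transpositions, duplications) into clean text. This is only an illustrative stand-in written for this card, not the SAGE library's actual corruption API; the error rate and edit types are assumptions.

import random

def corrupt(text: str, error_rate: float = 0.1, seed: int = 0) -> str:
    """Inject simple character-level typos into clean text (illustration only)."""
    rng = random.Random(seed)
    chars = list(text)
    out = []
    i = 0
    while i < len(chars):
        c = chars[i]
        if c.isalpha() and rng.random() < error_rate:
            op = rng.choice(["drop", "swap", "repeat"])
            if op == "drop":
                pass                              # delete the character
            elif op == "swap" and i + 1 < len(chars):
                out.extend([chars[i + 1], c])     # transpose with the next character
                i += 1                            # skip the character we already emitted
            else:
                out.extend([c, c])                # duplicate the character
        else:
            out.append(c)
        i += 1
    return "".join(out)

print(corrupt("I am going home", error_rate=0.3))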

Examples

Input:  Перведи мне текст на аглиском: "Screw you kuys, I am goin hme (c).
Output: Переведи мне текст на английском: "Screw you guys, I am going home" (c).

Input:  И не чсно прохожим в этот день непогожйи почему я веселый такйо
Output: И мне ясно прохожим в этот день непогожий, почему я веселый такой

Input:  If you bought something goregous, you well be very happy.
Output: If you bought something gorgeous, you will be very happy.
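
These examples can be reproduced with the Transformers text2text-generation pipeline. The snippet below is a minimal sketch; the max_length value is an assumption chosen to leave room for the corrected text.

from transformers import pipeline

corrector = pipeline("text2text-generation", model="ai-forever/sage-mt5-large")
result = corrector("If you bought something goregous, you well be very happy.", max_length=64)
print(result[0]["generated_text"])
# Expected (per the table above): "If you bought something gorgeous, you will be very happy."
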
Metrics
Quality

Below are automatic metrics that assess the correctness of spell checkers. We compare our solution both with open automatic spell checkers and with models from the ChatGPT family on all six available datasets:

  • RUSpellRU: texts collected from LiveJournal, with manually corrected typos and errors;
  • MultidomainGold: examples from 7 text sources, including the open web, news, social media, reviews, subtitles, policy documents and literary works;
  • MedSpellChecker: texts with errors from medical anamnesis;
  • GitHubTypoCorpusRu: spelling errors and typos in commits from GitHub;
  • BEA60K: English spelling errors collected from several domains;
  • JFLEG: 1601 sentences in English, which contain about 2 thousand spelling errors.

RUSpellRU, MultidomainGold, MedSpellChecker, and GitHubTypoCorpusRu are Russian spellchecking datasets; BEA60K and JFLEG are English ones.
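
The precision, recall, and F1 values below are word-level correction metrics. As a rough illustration of how such scores can be computed from (source, hypothesis, reference) triples, here is a simplified sketch; the official SAGE evaluation uses a more careful alignment, so exact numbers may differ.

def word_level_prf(source: str, hypothesis: str, reference: str):
    """Naive word-level precision/recall/F1 for spelling correction.

    Counts a true positive where the hypothesis matches the reference at a
    position that needed correction. Assumes whitespace-aligned sentences of
    equal length, which real evaluation does not require.
    """
    src, hyp, ref = source.split(), hypothesis.split(), reference.split()
    tp = sum(h == r != s for s, h, r in zip(src, hyp, ref))        # correct fixes
    fp = sum(h != s and h != r for s, h, r in zip(src, hyp, ref))  # unnecessary or wrong edits
    fn = sum(r != s and h != r for s, h, r in zip(src, hyp, ref))  # missed or wrong fixes
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

p, r, f1 = word_level_prf(
    "If you bought something goregous, you well be very happy.",
    "If you bought something gorgeous, you will be very happy.",
    "If you bought something gorgeous, you will be very happy.",
)
print(f"precision={p:.2f} recall={r:.2f} f1={f1:.2f}")  # precision=1.00 recall=1.00 f1=1.00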

RUSpellRU

Model Precision Recall F1
sage-mt5-large 55.7 68.5 61.4
sage-mt5-large (ft.) 88.4 71.6 79.1
sage-ai-service 93.5 82.4 87.6
gpt-3.5-turbo 39.6 62.3 48.5
gpt-4 69.5 81.0 74.8

MultidomainGold

Model Precision Recall F1
sage-mt5-large 35.4 57.9 43.9
sage-mt5-large (ft.) 65.3 62.7 63.9
sage-ai-service 70.9 68.8 69.9
gpt-3.5-turbo 17.8 56.1 27.0
gpt-4 31.1 78.1 44.5

MedSpellChecker

Model Precision Recall F1
sage-mt5-large 35.1 70.8 47.0
sage-mt5-large (ft.) 77.7 77.5 77.6
sage-ai-service 73.4 76.2 74.9
gpt-3.5-turbo 15.1 53.6 23.5
gpt-4 48.9 88.7 63.1

GitHubTypoCorpusRu

Model Precision Recall F1
sage-mt5-large 47.4 53.8 50.4
sage-mt5-large (ft.) 69.5 46.0 55.3
sage-ai-service 76.1 51.2 61.2
gpt-3.5-turbo 23.7 43.9 30.8
gpt-4 34.7 60.5 44.1

BEA60K

Model Precision Recall F1
sage-mt5-large 64.7 83.8 73.0
gpt-3.5-turbo 66.9 84.1 74.5
gpt-4 68.6 85.2 76.0
Bert (https://github.com/neuspell/neuspell) 65.8 79.6 72.0
SC-LSTM (https://github.com/neuspell/neuspell) 62.2 80.3 72.0

JFLEG

Model Precision Recall F1
sage-mt5-large 74.9 88.4 81.1
gpt-3.5-turbo 77.8 88.6 82.9
gpt-4 77.9 88.3 82.8
Bert (https://github.com/neuspell/neuspell) 78.5 85.4 81.8
SC-LSTM (https://github.com/neuspell/neuspell) 80.6 86.1 83.2
How to use
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("ai-forever/sage-mt5-large")
model = AutoModelForSeq2SeqLM.from_pretrained("ai-forever/sage-mt5-large", device_map='cuda')

sentence = "Перведи мне текст на аглиском: \"Screw you kuys, I am goin hme (c)."
inputs = tokenizer(sentence, max_length=None, padding="longest", truncation=False, return_tensors="pt")
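# max_length for generate must be an integer; cap the corrected text at ~1.5x the input length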
outputs = model.generate(**inputs.to(model.device), max_length=int(inputs["input_ids"].size(1) * 1.5))
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))

# ["Переведи мне текст на английском: "Screw you guys, I am going home" (c)."]
Limitations
  • For Russian, the model is intended to be fine-tuned for better performance (see the "(ft.)" rows in the tables above); a minimal fine-tuning sketch follows below.
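
A minimal fine-tuning sketch with the Transformers Seq2SeqTrainer is shown below. The example data, column names, and hyperparameters are assumptions for illustration, not the settings used to produce the "(ft.)" results above.

from datasets import Dataset
from transformers import (AutoTokenizer, AutoModelForSeq2SeqLM,
                          DataCollatorForSeq2Seq, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("ai-forever/sage-mt5-large")
model = AutoModelForSeq2SeqLM.from_pretrained("ai-forever/sage-mt5-large")

# Hypothetical in-domain pairs: noisy source sentences and their corrections.
pairs = Dataset.from_dict({
    "source": ["И не чсно прохожим в этот день непогожйи почему я веселый такйо"],
    "correction": ["И мне ясно прохожим в этот день непогожий, почему я веселый такой"],
})

def preprocess(batch):
    model_inputs = tokenizer(batch["source"], truncation=True, max_length=128)
    labels = tokenizer(text_target=batch["correction"], truncation=True, max_length=128)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = pairs.map(preprocess, batched=True, remove_columns=pairs.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="sage-mt5-large-ft",   # assumed output path
    per_device_train_batch_size=8,
    learning_rate=1e-4,               # assumed hyperparameters
    num_train_epochs=3,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
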
Resources
  • https://huggingface.co/ai-forever/sage-mt5-large
License

The mT5-large model, on which our solution is based, and its source code are distributed under the Apache-2.0 license. Our solution is released under the MIT license (https://choosealicense.com/licenses/mit).

Specifications
  • File size: 5 GB
  • Framework: PyTorch
  • Version: v1.0
  • Developer: SberDevices, AGI NLP
Contacts

nikita.martynov.98@list.ru
