Note: mT5 was only pre-trained on mC4, without any supervised training. Therefore, this model has to be fine-tuned before it is usable on a downstream task.
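As a rough illustration of that fine-tuning requirement, below is a minimal sketch of a single training step with the transformers library. The summarization-style input/target pair, the AdamW hyperparameters, and the use of PyTorch are illustrative assumptions, not part of the original model card.

```python
# Minimal fine-tuning sketch for google/mt5-small (assumes torch, transformers,
# and sentencepiece are installed; the text pair below is a made-up example).
import torch
from transformers import MT5ForConditionalGeneration, T5Tokenizer

model_name = "google/mt5-small"
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = MT5ForConditionalGeneration.from_pretrained(model_name)

# One hypothetical text-to-text training pair (summarization-style task).
inputs = tokenizer(
    "summarize: mT5 is a multilingual variant of T5 covering 101 languages.",
    return_tensors="pt",
)
labels = tokenizer("mT5 is multilingual T5.", return_tensors="pt").input_ids

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

model.train()
outputs = model(**inputs, labels=labels)  # cross-entropy loss is computed internally
outputs.loss.backward()
optimizer.step()
print(f"loss after one step: {outputs.loss.item():.4f}")
```

In practice you would loop this over a full downstream dataset rather than a single pair, but the labels-in, loss-out pattern stays the same.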
Authors:
Linting Xue, Noah Constant, Adam Roberts, Mihir Kale, Rami Al-Rfou, Aditya Siddhant, Aditya Barua, Colin Raffel
Abstract
The recent "Text-to-Text Transfer Transformer" (T5) leveraged a unified text-to-text format and scale to attain state-of-the-art results on a wide variety of English-language NLP tasks. In this paper, we introduce mT5, a multilingual variant of T5 that was pre-trained on a new Common Crawl-based dataset covering 101 languages. We describe the design and modified training of mT5 and demonstrate its state-of-the-art performance on many multilingual benchmarks. All of the code and model checkpoints used in this work are publicly available.
Runs of google mt5-small on huggingface.co
Total runs: 127.0K
24-hour runs: 619
3-day runs: -2.2K
7-day runs: 5.1K
30-day runs: 9.9K
More Information About the mt5-small Model on huggingface.co
mt5-small is an AI model hosted on huggingface.co, where the google mt5-small model can be used directly. huggingface.co supports a free trial of the mt5-small model and also provides paid usage. The mt5-small model can be called through an API from Node.js, Python, or plain HTTP.
huggingface.co serves as an online trial and API platform: it integrates mt5-small's API services and offers a free online trial, so you can try mt5-small for free by clicking the link below.
google mt5-small free online URL on huggingface.co:
mt5-small is an open-source model whose code is available on GitHub, so any user can find and install it from there for free. At the same time, huggingface.co hosts the model itself, so users can try out and debug mt5-small directly on huggingface.co without a local installation. Free API access is also supported.
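As a sketch of the API route mentioned above, the snippet below calls the model over plain HTTP in Python. The use of the public Hugging Face Inference API endpoint and an HF_TOKEN environment variable are assumptions for illustration, and because mt5-small is not fine-tuned, the raw generations will not be meaningful until it is adapted to a downstream task.

```python
# Rough sketch: call google/mt5-small over HTTP (assumes a Hugging Face access
# token is stored in the HF_TOKEN environment variable).
import os
import requests

API_URL = "https://api-inference.huggingface.co/models/google/mt5-small"
headers = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}

payload = {"inputs": "summarize: mT5 was pre-trained on mC4 covering 101 languages."}
response = requests.post(API_URL, headers=headers, json=payload)
print(response.json())  # without fine-tuning, the output is not meaningful
```

The same request can be issued from Node.js or any other HTTP client by POSTing the identical JSON payload with the Authorization header.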