from transformers import AutoTokenizer
from hf_hub_ctranslate2 import MultiLingualTranslatorCT2fromHfHub

model = MultiLingualTranslatorCT2fromHfHub(
    model_name_or_path="michaelfeil/ct2fast-m2m100_12B-last-ckpt",
    device="cpu",
    compute_type="int8",
    # all M2M100 checkpoints share the same SentencePiece vocabulary,
    # so the 418M tokenizer works for the 12B model as well
    tokenizer=AutoTokenizer.from_pretrained("facebook/m2m100_418M"),
)
outputs = model.generate(
    ["How do you call a fast Flamingo?", "Wie geht es dir?"],
    src_lang=["en", "de"],  # source language of each input sentence
    tgt_lang=["de", "fr"],  # target language for each translation
)
# outputs is a list of translated strings, one per input sentence
Use compute_type="int8_float16" for device="cuda" and compute_type="int8" for device="cpu".
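For a GPU, only those two arguments change; a minimal sketch of the CUDA variant of the example above:

from transformers import AutoTokenizer
from hf_hub_ctranslate2 import MultiLingualTranslatorCT2fromHfHub

# same model, loaded on GPU with the quantization recommended for CUDA
model = MultiLingualTranslatorCT2fromHfHub(
    model_name_or_path="michaelfeil/ct2fast-m2m100_12B-last-ckpt",
    device="cuda",
    compute_type="int8_float16",
    tokenizer=AutoTokenizer.from_pretrained("facebook/m2m100_418M"),
)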
M2M100 is a multilingual encoder-decoder (seq-to-seq) model trained for many-to-many multilingual translation. It was introduced in the paper Beyond English-Centric Multilingual Machine Translation (Fan et al., 2020) and first released in the fairseq repository. The model can directly translate between the 9,900 directions of 100 languages.
To translate into a target language, the target language id is forced as the first generated token. To do this, pass the forced_bos_token_id parameter to the generate method, as shown in the example below.
Note: M2M100Tokenizer depends on sentencepiece, so make sure to install it before running the example:
pip install sentencepiece
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer
hi_text = "जीवन एक चॉकलेट बॉक्स की तरह है।"
chinese_text = "生活就像一盒巧克力。"
model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100-12B-last-ckpt")
tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100-12B-last-ckpt")
# translate Hindi to French
tokenizer.src_lang = "hi"
encoded_hi = tokenizer(hi_text, return_tensors="pt")
generated_tokens = model.generate(**encoded_hi, forced_bos_token_id=tokenizer.get_lang_id("fr"))
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "La vie est comme une boîte de chocolat."# translate Chinese to English
tokenizer.src_lang = "zh"
encoded_zh = tokenizer(chinese_text, return_tensors="pt")
generated_tokens = model.generate(**encoded_zh, forced_bos_token_id=tokenizer.get_lang_id("en"))
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "Life is like a box of chocolate."
See the model hub to look for more fine-tuned versions.
Languages covered
Afrikaans (af), Amharic (am), Arabic (ar), Asturian (ast), Azerbaijani (az), Bashkir (ba), Belarusian (be), Bulgarian (bg), Bengali (bn), Breton (br), Bosnian (bs), Catalan; Valencian (ca), Cebuano (ceb), Czech (cs), Welsh (cy), Danish (da), German (de), Greek (el), English (en), Spanish (es), Estonian (et), Persian (fa), Fulah (ff), Finnish (fi), French (fr), Western Frisian (fy), Irish (ga), Gaelic; Scottish Gaelic (gd), Galician (gl), Gujarati (gu), Hausa (ha), Hebrew (he), Hindi (hi), Croatian (hr), Haitian; Haitian Creole (ht), Hungarian (hu), Armenian (hy), Indonesian (id), Igbo (ig), Iloko (ilo), Icelandic (is), Italian (it), Japanese (ja), Javanese (jv), Georgian (ka), Kazakh (kk), Central Khmer (km), Kannada (kn), Korean (ko), Luxembourgish; Letzeburgesch (lb), Ganda (lg), Lingala (ln), Lao (lo), Lithuanian (lt), Latvian (lv), Malagasy (mg), Macedonian (mk), Malayalam (ml), Mongolian (mn), Marathi (mr), Malay (ms), Burmese (my), Nepali (ne), Dutch; Flemish (nl), Norwegian (no), Northern Sotho (ns), Occitan (post 1500) (oc), Oriya (or), Panjabi; Punjabi (pa), Polish (pl), Pushto; Pashto (ps), Portuguese (pt), Romanian; Moldavian; Moldovan (ro), Russian (ru), Sindhi (sd), Sinhala; Sinhalese (si), Slovak (sk), Slovenian (sl), Somali (so), Albanian (sq), Serbian (sr), Swati (ss), Sundanese (su), Swedish (sv), Swahili (sw), Tamil (ta), Thai (th), Tagalog (tl), Tswana (tn), Turkish (tr), Ukrainian (uk), Urdu (ur), Uzbek (uz), Vietnamese (vi), Wolof (wo), Xhosa (xh), Yiddish (yi), Yoruba (yo), Chinese (zh), Zulu (zu)
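To check programmatically which of these codes the tokenizer accepts, M2M100Tokenizer keeps an internal mapping from language code to forced-BOS token id (a minimal sketch; the lang_code_to_id attribute may vary across transformers versions, in which case call get_lang_id per code instead):

from transformers import M2M100Tokenizer

tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100-12B-last-ckpt")
# maps each language code (e.g. "fr") to the token id used as forced BOS
print(sorted(tokenizer.lang_code_to_id))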
BibTeX entry and citation info
@misc{fan2020englishcentric,
title={Beyond English-Centric Multilingual Machine Translation},
author={Angela Fan and Shruti Bhosale and Holger Schwenk and Zhiyi Ma and Ahmed El-Kishky and Siddharth Goyal and Mandeep Baines and Onur Celebi and Guillaume Wenzek and Vishrav Chaudhary and Naman Goyal and Tom Birch and Vitaliy Liptchinsky and Sergey Edunov and Edouard Grave and Michael Auli and Armand Joulin},
year={2020},
eprint={2010.11125},
archivePrefix={arXiv},
primaryClass={cs.CL}
}