castorini / afriteva_base

Model last updated: June 30, 2024
text2text-generation

Introduction of afriteva_base

Model Details of afriteva_base

language:

  • om
  • am
  • rw
  • rn
  • ha
  • ig
  • pcm
  • so
  • sw
  • ti
  • yo
  • multilingual

afriteva_base

Model description

AfriTeVa base is a multilingual sequence-to-sequence model pre-trained on 10 African languages.

Languages

Afaan Oromoo (orm), Amharic (amh), Gahuza (gah), Hausa (hau), Igbo (igb), Nigerian Pidgin (pcm), Somali (som), Swahili (swa), Tigrinya (tig), Yoruba (yor)

More information on the model and dataset (a configuration-inspection sketch follows this list):
The model
  • 229M parameters encoder-decoder architecture (T5-like)
  • 12 layers, 12 attention heads and 512 token sequence length
The dataset
  • Multilingual: 10 African languages listed above
  • 143 Million Tokens (1GB of text data)
  • Tokenizer Vocabulary Size: 70,000 tokens
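
These architecture and dataset numbers can be sanity-checked against the published checkpoint by inspecting its configuration. The sketch below assumes the checkpoint exposes T5-style configuration attributes (num_layers, num_heads, vocab_size); that naming follows from the T5-like architecture and is an assumption, not something stated in this card.

>>> from transformers import AutoConfig, AutoTokenizer

>>> config = AutoConfig.from_pretrained("castorini/afriteva_base")
>>> tokenizer = AutoTokenizer.from_pretrained("castorini/afriteva_base")

>>> config.num_layers, config.num_heads  # encoder layers and attention heads (T5-style attribute names)
>>> config.vocab_size                    # should be close to the 70,000-token vocabulary reported above
>>> len(tokenizer)                       # tokenizer size, for comparison with config.vocab_size
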
Intended uses & limitations

afriteva_base is a pre-trained model primarily intended to be fine-tuned on multilingual sequence-to-sequence tasks.

>>> from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

>>> tokenizer = AutoTokenizer.from_pretrained("castorini/afriteva_base")
>>> model = AutoModelForSeq2SeqLM.from_pretrained("castorini/afriteva_base")

>>> src_text = "Ó hùn ọ́ láti di ara wa bí?"
>>> tgt_text = "Would you like to be?"

>>> model_inputs = tokenizer(src_text, return_tensors="pt")
>>> with tokenizer.as_target_tokenizer():
...     labels = tokenizer(tgt_text, return_tensors="pt").input_ids

>>> model(**model_inputs, labels=labels) # forward pass
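
For a quick qualitative check you can also generate directly from the pre-trained checkpoint, although outputs before fine-tuning are expected to be rough; a minimal sketch reusing the objects above:

>>> generated_ids = model.generate(**model_inputs, max_new_tokens=32)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)

Note that recent versions of transformers replace the as_target_tokenizer() context manager with the text_target argument, e.g. labels = tokenizer(text_target=tgt_text, return_tensors="pt").input_ids.
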
Training Procedure

For information on training procedures, please refer to the AfriTeVa paper or repository.

BibTex entry and Citation info

coming soon ...

Runs of castorini afriteva_base on huggingface.co

  • Total runs: 149
  • 24-hour runs: -25
  • 3-day runs: -27
  • 7-day runs: -46
  • 30-day runs: 22

More Information About afriteva_base huggingface.co Model

afriteva_base huggingface.co

afriteva_base is an AI model hosted on huggingface.co that can be used instantly through the castorini afriteva_base model page. huggingface.co supports a free trial of the afriteva_base model and also provides paid use of the model. The model can additionally be called through an API from Node.js, Python, or plain HTTP.
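
As a sketch of the HTTP route, the snippet below calls the hosted Inference API with plain requests; it assumes the model is actually served by that API and that HF_TOKEN holds a valid Hugging Face access token (neither is confirmed by this page).

import os
import requests

# Serverless Inference API endpoint for this model (assumption: the model is deployed there).
API_URL = "https://api-inference.huggingface.co/models/castorini/afriteva_base"
headers = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}

payload = {"inputs": "Ó hùn ọ́ láti di ara wa bí?"}
response = requests.post(API_URL, headers=headers, json=payload)
print(response.json())  # typically a list like [{"generated_text": "..."}]
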

afriteva_base huggingface.co Url

https://huggingface.co/castorini/afriteva_base

castorini afriteva_base online free

The huggingface.co model page is an online trial and API platform that integrates afriteva_base's model outputs, including API services, and provides a free online trial of afriteva_base; you can try afriteva_base online for free at the URL above.


afriteva_base install

afriteva_base is an open-source model whose code is available on GitHub, where any user can find and install it for free. At the same time, huggingface.co hosts the model itself, so users can debug and trial the installed model directly on the model page; installation through the API is likewise free.



Provider of afriteva_base huggingface.co

castorini
