castorini / afriberta_base

Model last updated: June 15, 2022
Pipeline tag: fill-mask


language:

  • om
  • am
  • rw
  • rn
  • ha
  • ig
  • pcm
  • so
  • sw
  • ti
  • yo
  • multilingual

afriberta_base

Model description

AfriBERTa base is a pretrained multilingual language model with around 111 million parameters. The model has 8 layers, 6 attention heads, 768 hidden units, and a feed-forward size of 3072. It was pretrained on 11 African languages, namely Afaan Oromoo (also called Oromo), Amharic, Gahuza (a mixed language containing Kinyarwanda and Kirundi), Hausa, Igbo, Nigerian Pidgin, Somali, Swahili, Tigrinya, and Yorùbá. The model has been shown to obtain competitive downstream performance on text classification and named entity recognition in several African languages, including some it was not pretrained on.

Intended uses & limitations
How to use

You can use this model with Transformers for any downstream task. For example, to fine-tune it on a token classification task, do the following:

>>> from transformers import AutoTokenizer, AutoModelForTokenClassification
>>> model = AutoModelForTokenClassification.from_pretrained("castorini/afriberta_base")
>>> tokenizer = AutoTokenizer.from_pretrained("castorini/afriberta_base")
# The maximum length has to be set manually because the tokenizer was imported
# from a SentencePiece model, which Hugging Face does not properly support yet.
>>> tokenizer.model_max_length = 512
Limitations and bias

  • This model is possibly limited by its training data, which consists mainly of news articles from a specific span of time. As a result, it may not generalize well.
  • The model was trained on very little data (less than 1 GB), so it may not have seen enough data to learn very complex linguistic relations.
Training data

The model was trained on an aggregation of datasets from the BBC news website and Common Crawl.

Training procedure

For information on the training procedure, please refer to the AfriBERTa paper or repository.

BibTeX entry and citation info
@inproceedings{ogueji-etal-2021-small,
    title = "Small Data? No Problem! Exploring the Viability of Pretrained Multilingual Language Models for Low-resourced Languages",
    author = "Ogueji, Kelechi  and
      Zhu, Yuxin  and
      Lin, Jimmy",
    booktitle = "Proceedings of the 1st Workshop on Multilingual Representation Learning",
    month = nov,
    year = "2021",
    address = "Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.mrl-1.11",
    pages = "116--126",
}


More information about afriberta_base

Model URL: https://huggingface.co/castorini/afriberta_base

Provider: castorini