facebook / xlm-roberta-xxl

huggingface.co
Total runs: 19.8K
24-hour runs: 0
7-day runs: 11
30-day runs: -1.4K
Model Last Updated: August 08 2022
fill-mask

Introduction of xlm-roberta-xxl

Model Details of xlm-roberta-xxl

XLM-RoBERTa-XL (xxlarge-sized model)

XLM-RoBERTa-XL is a model pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages. It was introduced in the paper Larger-Scale Transformers for Multilingual Masked Language Modeling by Naman Goyal, Jingfei Du, Myle Ott, Giri Anantharaman, and Alexis Conneau, and first released in this repository.

Disclaimer: The team releasing XLM-RoBERTa-XL did not write a model card for this model so this model card has been written by the Hugging Face team.

Model description

XLM-RoBERTa-XL is an extra-large multilingual version of RoBERTa. It is pre-trained on 2.5TB of filtered CommonCrawl data containing 100 languages.

RoBERTa is a transformer model pretrained on a large corpus in a self-supervised fashion. This means it was pretrained on raw texts only, with no humans labeling them in any way (which is why it can use lots of publicly available data), using an automatic process to generate inputs and labels from those texts.

More precisely, it was pretrained with the masked language modeling (MLM) objective. Taking a sentence, the model randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict the masked words. This is different from traditional recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models like GPT, which internally mask the future tokens. This allows the model to learn a bidirectional representation of the sentence.
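As a rough illustration of the objective (a minimal sketch using the transformers library; the sentence and the masked position are arbitrary choices for demonstration, not the actual pretraining pipeline):

import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("facebook/xlm-roberta-xxl")
model = AutoModelForMaskedLM.from_pretrained("facebook/xlm-roberta-xxl")

# Tokenize a sentence; labels set to -100 are ignored by the loss
inputs = tokenizer("Europe is a small continent.", return_tensors="pt")
labels = torch.full_like(inputs["input_ids"], -100)

# Mask one token (position chosen arbitrarily here; pretraining
# masks ~15% of tokens at random)
mask_pos = 4
labels[0, mask_pos] = inputs["input_ids"][0, mask_pos]
inputs["input_ids"][0, mask_pos] = tokenizer.mask_token_id

# For masked-LM models, passing labels returns the cross-entropy
# loss that pretraining minimizes
outputs = model(**inputs, labels=labels)
print(outputs.loss)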

This way, the model learns an inner representation of 100 languages that can then be used to extract features useful for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the XLM-RoBERTa-XL model as inputs.
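For instance (a minimal sketch, assuming the plain AutoModel encoder and a scikit-learn classifier; the sentences and labels are toy data made up for illustration):

import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import LogisticRegression

tokenizer = AutoTokenizer.from_pretrained("facebook/xlm-roberta-xxl")
encoder = AutoModel.from_pretrained("facebook/xlm-roberta-xxl")

# Toy labeled sentences in two languages (made up for illustration)
sentences = ["I love this movie.", "Dieser Film ist furchtbar."]
labels = [1, 0]

with torch.no_grad():
    enc = tokenizer(sentences, padding=True, return_tensors="pt")
    # Hidden state of the first token (<s>) as a sentence-level feature
    features = encoder(**enc).last_hidden_state[:, 0, :]

# Any standard classifier can be trained on the extracted features
clf = LogisticRegression(max_iter=1000).fit(features.numpy(), labels)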

Intended uses & limitations

You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task. See the model hub to look for fine-tuned versions on a task that interests you.

Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked) to make decisions, such as sequence classification, token classification or question answering. For tasks such as text generation, you should look at models like GPT2.
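For example, a sequence-classification head can be attached and fine-tuned (a minimal sketch, assuming the transformers Trainer API; train_dataset is a hypothetical tokenized dataset, and num_labels=2 is an assumed binary task):

from transformers import (AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

# Adds a freshly initialized classification head on top of the encoder
model = AutoModelForSequenceClassification.from_pretrained(
    "facebook/xlm-roberta-xxl", num_labels=2)

args = TrainingArguments(output_dir="xlmr-xxl-finetuned",
                         per_device_train_batch_size=1,
                         num_train_epochs=1)

# train_dataset: a hypothetical tokenized dataset with a "labels" column
trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()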

Usage

You can use this model directly with a pipeline for masked language modeling:

>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='facebook/xlm-roberta-xxl')
>>> unmasker("Europe is a <mask> continent.")

[{'score': 0.22996895015239716,
  'token': 28811,
  'token_str': 'European',
  'sequence': 'Europe is a European continent.'},
 {'score': 0.14307449758052826,
  'token': 21334,
  'token_str': 'large',
  'sequence': 'Europe is a large continent.'},
 {'score': 0.12239163368940353,
  'token': 19336,
  'token_str': 'small',
  'sequence': 'Europe is a small continent.'},
 {'score': 0.07025063782930374,
  'token': 18410,
  'token_str': 'vast',
  'sequence': 'Europe is a vast continent.'},
 {'score': 0.032869212329387665,
  'token': 6957,
  'token_str': 'big',
  'sequence': 'Europe is a big continent.'}]

Here is how to use this model to get the features of a given text in PyTorch:

from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("facebook/xlm-roberta-xxl")
model = AutoModelForMaskedLM.from_pretrained("facebook/xlm-roberta-xxl")

# prepare input
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors="pt")

# forward pass
output = model(**encoded_input)
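The masked-language-modeling head returns per-token vocabulary logits; the encoder's hidden states can be requested explicitly if you want features instead (a minimal sketch continuing the snippet above; output_hidden_states is a standard transformers option):

print(output.logits.shape)  # (batch_size, sequence_length, vocab_size)

# Request the encoder's hidden states to use as features
output = model(**encoded_input, output_hidden_states=True)
features = output.hidden_states[-1]  # last-layer token embeddings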

BibTeX entry and citation info

@article{DBLP:journals/corr/abs-2105-00572,
  author    = {Naman Goyal and
               Jingfei Du and
               Myle Ott and
               Giri Anantharaman and
               Alexis Conneau},
  title     = {Larger-Scale Transformers for Multilingual Masked Language Modeling},
  journal   = {CoRR},
  volume    = {abs/2105.00572},
  year      = {2021},
  url       = {https://arxiv.org/abs/2105.00572},
  eprinttype = {arXiv},
  eprint    = {2105.00572},
  timestamp = {Wed, 12 May 2021 15:54:31 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-2105-00572.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}

Runs of facebook xlm-roberta-xxl on huggingface.co

Total runs: 19.8K
24-hour runs: 0
3-day runs: -14
7-day runs: 11
30-day runs: -1.4K

More Information About xlm-roberta-xxl huggingface.co Model

xlm-roberta-xxl is released under the MIT license. For the license text, visit:

https://choosealicense.com/licenses/mit

xlm-roberta-xxl huggingface.co

xlm-roberta-xxl is an AI model hosted on huggingface.co that can be used instantly from this facebook xlm-roberta-xxl model page. huggingface.co supports a free trial of the xlm-roberta-xxl model and also provides paid use. The model can be called through an API from Node.js, Python, or plain HTTP.
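In Python, such an API call might look like the following (a minimal sketch, assuming the standard Hugging Face Inference API endpoint; YOUR_HF_TOKEN is a placeholder for a reader-supplied access token):

import requests

API_URL = "https://api-inference.huggingface.co/models/facebook/xlm-roberta-xxl"
headers = {"Authorization": "Bearer YOUR_HF_TOKEN"}  # placeholder token

response = requests.post(API_URL, headers=headers,
                         json={"inputs": "Europe is a <mask> continent."})
print(response.json())  # fill-mask predictions with scores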

xlm-roberta-xxl huggingface.co Url

https://huggingface.co/facebook/xlm-roberta-xxl

facebook xlm-roberta-xxl online free

huggingface.co is an online platform for trying models and calling them via API. It hosts xlm-roberta-xxl, including its API services, and provides a free online trial; you can try xlm-roberta-xxl for free by clicking the link below.

facebook xlm-roberta-xxl online free url in huggingface.co:

https://huggingface.co/facebook/xlm-roberta-xxl

xlm-roberta-xxl install

xlm-roberta-xxl is an open-source model; its code is available on GitHub, and any user can find and install it for free. At the same time, huggingface.co hosts xlm-roberta-xxl directly, so users can debug and try the model there without installing anything. Free API access is also supported.
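A typical local setup might look like this (a minimal sketch, assuming pip, PyTorch, and the transformers library; the weights are downloaded from huggingface.co on first use):

# Shell: pip install transformers torch

from transformers import AutoModelForMaskedLM

# Downloads and caches the model weights on first use
model = AutoModelForMaskedLM.from_pretrained("facebook/xlm-roberta-xxl")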

xlm-roberta-xxl install url in huggingface.co:

https://huggingface.co/facebook/xlm-roberta-xxl

Url of xlm-roberta-xxl

xlm-roberta-xxl huggingface.co Url:

https://huggingface.co/facebook/xlm-roberta-xxl

Provider of xlm-roberta-xxl huggingface.co

facebook

Other API from facebook

[Run statistics for other facebook models on huggingface.co (total runs, run growth, growth rate, last-updated date); the model names were not preserved in the page extraction.]