In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open-sources an uncased model for Turkish 🎉
# 🇹🇷 BERTurk
BERTurk is a community-driven uncased BERT model for Turkish.
Some of the datasets used for pretraining and evaluation were contributed by the
awesome Turkish NLP community, which also chose the model's name: BERTurk.
## Stats
The current version of the model is trained on a filtered and sentence-segmented
version of the Turkish OSCAR corpus, a recent Wikipedia dump, various OPUS corpora
and a special corpus provided by Kemal Oflazer.
The final training corpus has a size of 35GB and 4,404,976,662 tokens.
Thanks to Google's TensorFlow Research Cloud (TFRC), we were able to train an
uncased model on a TPU v3-8 for 2M steps.
For this model we use a vocab size of 128k.
## Model weights
Currently only PyTorch-Transformers compatible weights are available. If you need
access to TensorFlow checkpoints, please raise an issue!
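The weights can be loaded with the Transformers library in the usual way. The snippet below is a minimal sketch using the standard `AutoTokenizer`/`AutoModelForMaskedLM` API; the example sentence and the choice of a masked-LM head are illustrative, not part of the card:

```python
# Minimal sketch: load BERTurk and fill in a masked token.
# Model ID is taken from the card; everything else is standard
# Hugging Face Transformers usage.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_name = "dbmdz/bert-base-turkish-128k-uncased"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Illustrative Turkish sentence with one masked token.
text = f"Türkiye'nin başkenti {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Find the mask position and take the highest-scoring token.
mask_index = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```

Since the model is uncased, the tokenizer lowercases input text before tokenization, so mixed-case queries map to the same vocabulary entries.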
For questions about our BERT models, just open an issue here 🤗
## Acknowledgments
Thanks to Kemal Oflazer for providing us additional large corpora for Turkish.
Many thanks to Reyyan Yeniterzi for providing us the Turkish NER dataset for
evaluation.
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the Hugging Face team, it is possible to
download both cased and uncased models from their S3 storage 🤗
## Runs of dbmdz bert-base-turkish-128k-uncased on huggingface.co

| Period  | Runs  |
|---------|-------|
| Total   | 25.3K |
| 24-hour | 255   |
| 3-day   | 389   |
| 7-day   | 755   |
| 30-day  | 2.1K  |