In this repository the MDZ Digital Library team (dbmdz) at the Bavarian State
Library open-sources a cased model for Turkish 🎉
🇹🇷 BERTurk
BERTurk is a community-driven cased BERT model for Turkish.
Some datasets used for pretraining and evaluation were contributed by the
awesome Turkish NLP community, which also chose the model name: BERTurk.
Stats
The current version of the model is trained on a filtered and sentence-segmented
version of the Turkish OSCAR corpus, a recent Wikipedia dump, various OPUS
corpora and a special corpus provided by Kemal Oflazer.
The final training corpus has a size of 35GB and 4,404,976,662 tokens.
Thanks to Google's TensorFlow Research Cloud (TFRC) we were able to train a
cased model on a TPU v3-8 for 2M steps.
For this model we use a vocab size of 128k.
Model weights
Currently only PyTorch-Transformers compatible weights are available. If you
need access to TensorFlow checkpoints, please raise an issue!
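A minimal loading sketch using the Hugging Face Transformers library and the model identifier this card describes; the example sentence is our own illustration:

```python
from transformers import AutoModel, AutoTokenizer

# Load the cased BERTurk tokenizer and PyTorch weights from the Hugging Face hub
tokenizer = AutoTokenizer.from_pretrained("dbmdz/bert-base-turkish-128k-cased")
model = AutoModel.from_pretrained("dbmdz/bert-base-turkish-128k-cased")

# Encode a Turkish sentence and extract contextual embeddings
inputs = tokenizer("Merhaba dünya!", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence length, hidden size)
```

Because this is the cased model, `Türkiye` and `türkiye` are tokenized differently, so avoid lowercasing your input text.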
For questions about our BERT models just open an issue here 🤗
Acknowledgments
Thanks to Kemal Oflazer for providing us with additional large corpora for
Turkish. Many thanks to Reyyan Yeniterzi for providing us with the Turkish NER
dataset for evaluation.
Research supported with Cloud TPUs from Google's TensorFlow Research Cloud
(TFRC). Thanks for providing access to the TFRC ❤️
Thanks to the generous support from the Hugging Face team, it is possible to
download both cased and uncased models from their S3 storage 🤗