ntu-spml / distilhubert

huggingface.co
Total runs: 2.9M
24-hour runs: 420.2K
7-day runs: 1.7M
30-day runs: 2.9M
Model last updated: July 25, 2023
feature-extraction

Introduction to distilhubert

Model Details of distilhubert

DistilHuBERT

DistilHuBERT by NTU Speech Processing & Machine Learning Lab

The base model was pretrained on 16 kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16 kHz.
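
As a minimal sketch of extracting features with this checkpoint (the input file "speech.wav" and its original sampling rate are assumptions for illustration), you can resample the audio to 16 kHz before passing it to the model:

```python
# Minimal feature-extraction sketch; "speech.wav" is a placeholder input file.
import torch
import torchaudio
from transformers import AutoFeatureExtractor, AutoModel

feature_extractor = AutoFeatureExtractor.from_pretrained("ntu-spml/distilhubert")
model = AutoModel.from_pretrained("ntu-spml/distilhubert")

waveform, sample_rate = torchaudio.load("speech.wav")  # shape: (channels, samples)
waveform = waveform.mean(dim=0)  # mix down to mono
if sample_rate != 16_000:
    # Resample to the 16 kHz rate the model was pretrained on.
    waveform = torchaudio.transforms.Resample(sample_rate, 16_000)(waveform)

inputs = feature_extractor(waveform.numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    hidden_states = model(**inputs).last_hidden_state  # (batch, frames, hidden_size)
print(hidden_states.shape)
```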

Note: This model does not have a tokenizer as it was pretrained on audio alone. In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data. Check out this blog for a more detailed explanation of how to fine-tune the model.

Paper: DistilHuBERT: Speech Representation Learning by Layer-wise Distillation of Hidden-unit BERT

Authors: Heng-Jui Chang, Shu-wen Yang, Hung-yi Lee

Abstract: Self-supervised speech representation learning methods like wav2vec 2.0 and Hidden-unit BERT (HuBERT) leverage unlabeled speech data for pre-training and offer good representations for numerous speech processing tasks. Despite the success of these methods, they require large memory and high pre-training costs, making them inaccessible for researchers in academia and small companies. Therefore, this paper introduces DistilHuBERT, a novel multi-task learning framework to distill hidden representations from a HuBERT model directly. This method reduces HuBERT's size by 75% and makes it 73% faster while retaining most performance in ten different tasks. Moreover, DistilHuBERT required little training time and data, opening the possibilities of pre-training personal and on-device SSL models for speech.

The original model can be found at https://github.com/s3prl/s3prl/tree/master/s3prl/upstream/distiller.

Usage

See this blog for more information on how to fine-tune the model. Note that the class Wav2Vec2ForCTC has to be replaced by HubertForCTC (see the sketch below).
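
As a rough sketch of that swap (the vocab.json file and tokenizer settings below are assumptions and must be built from your labeled text data, following the linked blog's recipe for wav2vec 2.0), loading the checkpoint for CTC fine-tuning might look like this:

```python
# Sketch: load distilhubert for CTC fine-tuning with HubertForCTC instead of
# Wav2Vec2ForCTC. "vocab.json" is a placeholder vocabulary built from your data.
from transformers import (
    HubertForCTC,
    Wav2Vec2CTCTokenizer,
    Wav2Vec2FeatureExtractor,
    Wav2Vec2Processor,
)

tokenizer = Wav2Vec2CTCTokenizer(
    "vocab.json", unk_token="[UNK]", pad_token="[PAD]", word_delimiter_token="|"
)
feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("ntu-spml/distilhubert")
processor = Wav2Vec2Processor(feature_extractor=feature_extractor, tokenizer=tokenizer)

model = HubertForCTC.from_pretrained(
    "ntu-spml/distilhubert",
    ctc_loss_reduction="mean",
    pad_token_id=processor.tokenizer.pad_token_id,
    vocab_size=len(processor.tokenizer),
)
# From here, fine-tune on labeled speech/text pairs, e.g. with the Trainer API.
```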

Runs of ntu-spml distilhubert on huggingface.co

Total runs: 2.9M
24-hour runs: 420.2K
3-day runs: 993.4K
7-day runs: 1.7M
30-day runs: 2.9M

More Information About the distilhubert Model on huggingface.co

For the distilhubert license, visit:

https://choosealicense.com/licenses/apache-2.0

distilhubert huggingface.co

distilhubert is an AI model hosted on huggingface.co, where the ntu-spml/distilhubert model can be used instantly. huggingface.co supports a free trial of the distilhubert model and also provides paid usage. The model can be called through an API from Node.js, Python, or plain HTTP.

distilhubert huggingface.co URL

https://huggingface.co/ntu-spml/distilhubert

ntu-spml distilhubert online free

huggingface.co is an online trial and API platform that integrates distilhubert's model capabilities, including API services, and provides a free online trial of distilhubert. You can try distilhubert online for free via the link below.

ntu-spml distilhubert free online trial URL on huggingface.co:

https://huggingface.co/ntu-spml/distilhubert

distilhubert install

distilhubert is an open-source model whose original code is available on GitHub, where any user can find and install it for free. huggingface.co also hosts the model, so users can try distilhubert directly on huggingface.co for debugging and evaluation; free installation and API access are supported as well.
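
A minimal installation sketch (the pip command and unpinned package versions are assumptions, not an official installer):

```python
# Install the dependencies first, e.g.:  pip install transformers torch
from transformers import AutoModel

# Downloads and caches the checkpoint from huggingface.co on first use.
model = AutoModel.from_pretrained("ntu-spml/distilhubert")
print(sum(p.numel() for p in model.parameters()), "parameters")
```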

distilhubert install URL on huggingface.co:

https://huggingface.co/ntu-spml/distilhubert

URL of distilhubert

distilhubert huggingface.co URL:
https://huggingface.co/ntu-spml/distilhubert

Provider of distilhubert on huggingface.co

ntu-spml
ORGANIZATIONS