Salesforce / codet5-large

huggingface.co
Total runs: 4.0K | 24-hour runs: 0 | 7-day runs: 401 | 30-day runs: 757
Last updated: January 21, 2025
Task: text2text-generation

Introduction of codet5-large

Model Details of codet5-large

CodeT5 (large-size model 770M)

Model description

CodeT5 is a family of encoder-decoder language models for code from the paper: CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation by Yue Wang, Weishi Wang, Shafiq Joty, and Steven C.H. Hoi.

The checkpoint included in this repository is denoted as CodeT5-large (770M), which was introduced in the paper: CodeRL: Mastering Code Generation through Pretrained Models and Deep Reinforcement Learning by Hung Le, Yue Wang, Akhilesh Deepak Gotmare, Silvio Savarese, and Steven C.H. Hoi.

Training data

CodeT5-large was pretrained on CodeSearchNet data in six programming languages (Ruby/JavaScript/Go/Python/Java/PHP). See Section 4.1 of the paper for more details.
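
If you want to inspect the kind of data CodeT5-large was pretrained on, a minimal sketch along the following lines can pull a slice of CodeSearchNet from the Hub. The "code_search_net" dataset id and the trust_remote_code flag are assumptions about the current datasets library, not part of the original model card:

from datasets import load_dataset

# Load a small slice of the Python portion of CodeSearchNet from the Hub.
ds = load_dataset("code_search_net", "python", split="train[:100]", trust_remote_code=True)
print(ds)      # inspect the available columns
print(ds[0])   # one function-level example with code and documentation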

Training procedure

CodeT5-large was pretrained with a masked span prediction objective for 150 epochs. See Section 4.1 of the paper for more details.
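
As an illustration of the masked span prediction objective, the sketch below shows how T5-style sentinel tokens pair a corrupted input with its target spans and how the loss is computed; it is only a toy example, not the actual pretraining pipeline:

from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("Salesforce/codet5-large")
model = T5ForConditionalGeneration.from_pretrained("Salesforce/codet5-large")

# Masked spans in the input are replaced by sentinel tokens <extra_id_0>, <extra_id_1>, ...
source = "def add(a, b): <extra_id_0> a + b"
# The target lists each sentinel followed by the span it replaced.
target = "<extra_id_0> return <extra_id_1>"

input_ids = tokenizer(source, return_tensors="pt").input_ids
labels = tokenizer(target, return_tensors="pt").input_ids
loss = model(input_ids=input_ids, labels=labels).loss  # cross-entropy over the target spans
print(loss.item())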

Evaluation results

We validate the effectiveness of this checkpoint, pretrained with simplified strategies, on the CodeXGLUE benchmark. See Appendix A.1 of the paper for more details.

How to use

This model can be easily loaded using the T5ForConditionalGeneration functionality:

from transformers import AutoTokenizer, T5ForConditionalGeneration
tokenizer = AutoTokenizer.from_pretrained("Salesforce/codet5-large")
model = T5ForConditionalGeneration.from_pretrained("Salesforce/codet5-large")
text = "def greet(user): print(f'hello <extra_id_0>!')"
input_ids = tokenizer(text, return_tensors="pt").input_ids

# simply generate a single sequence
generated_ids = model.generate(input_ids, max_length=8)
print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))
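
To get several candidate completions for the masked span instead of a single sequence, standard transformers generation options such as beam search can be used; the settings below are illustrative, not from the original model card:

# sample several candidates for the masked span with beam search
generated = model.generate(
    input_ids,
    max_length=10,
    num_beams=5,
    num_return_sequences=3,
    early_stopping=True,
)
for seq in generated:
    print(tokenizer.decode(seq, skip_special_tokens=True))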

BibTeX entry and citation info
@inproceedings{CodeT52021,
  author    = {Yue Wang and Weishi Wang and Shafiq R. Joty and Steven C. H. Hoi},
  title     = {CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation},
  booktitle = {EMNLP},
  pages     = {8696--8708},
  publisher = {Association for Computational Linguistics},
  year      = {2021}
}

@article{CodeRL2022,
  author    = {Hung Le and Yue Wang and Akhilesh Deepak Gotmare and Silvio Savarese and Steven C. H. Hoi},
  title     = {CodeRL: Mastering Code Generation through Pretrained Models and Deep Reinforcement Learning},
  journal   = {arXiv preprint},
  volume    = {abs/2207.01780},
  year      = {2022}
}

Runs of Salesforce codet5-large on huggingface.co

Total runs: 4.0K
24-hour runs: 0
3-day runs: -16
7-day runs: 401
30-day runs: 757

More Information About codet5-large huggingface.co Model

For the codet5-large license (BSD-3-Clause), visit:

https://choosealicense.com/licenses/bsd-3-clause

codet5-large huggingface.co

codet5-large is an AI model hosted on huggingface.co that can be used instantly through the Salesforce/codet5-large repository. huggingface.co supports a free trial of the codet5-large model as well as paid usage, and the model can be called through an API from Node.js, Python, or plain HTTP.
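
As a sketch of calling the model over HTTP, assuming a hosted Inference API endpoint is available for this model (the endpoint URL format and the YOUR_HF_TOKEN placeholder below are assumptions):

import requests

API_URL = "https://api-inference.huggingface.co/models/Salesforce/codet5-large"
headers = {"Authorization": "Bearer YOUR_HF_TOKEN"}  # placeholder token

payload = {"inputs": "def greet(user): print(f'hello <extra_id_0>!')"}
response = requests.post(API_URL, headers=headers, json=payload)
print(response.json())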

Salesforce codet5-large online free

huggingface.co is an online trial and API platform that integrates codet5-large's modeling capabilities, including API services, and provides a free online trial of codet5-large. You can try codet5-large online for free via the link below.

Salesforce codet5-large online free url in huggingface.co:

https://huggingface.co/Salesforce/codet5-large

codet5-large install

codet5-large is an open-source model whose code is available on GitHub, and any user can find and install it there for free. At the same time, huggingface.co hosts codet5-large so that users can debug and try the model directly on huggingface.co. Free API access is also supported.

codet5-large install url in huggingface.co:

https://huggingface.co/Salesforce/codet5-large

Url of codet5-large

codet5-large huggingface.co Url

https://huggingface.co/Salesforce/codet5-large

Provider of codet5-large huggingface.co

Salesforce

Other API from Salesforce

Run statistics for other Salesforce models hosted on huggingface.co:

Total runs | Run Growth | Growth Rate | Updated
37.7K      | -16.9K     | -44.71%     | January 21 2025
8.7K       | 1.8K       | 20.76%      | February 19 2024
2.7K       | 669        | 24.14%      | January 25 2025
688        | 605        | 90.03%      | January 21 2025
564        | 224        | 39.79%      | January 21 2025
290        | 1          | 0.34%       | January 15 2025
49         | -56        | -114.29%    | January 15 2025