Salesforce / codet5-large-ntp-py

huggingface.co
Last updated: January 21, 2025
Pipeline tag: text2text-generation

CodeT5 (large-size model pretrained with NTP objective on Python)

Model description

CodeT5 is a family of encoder-decoder language models for code from the paper: CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation by Yue Wang, Weishi Wang, Shafiq Joty, and Steven C.H. Hoi.

The checkpoint included in this repository is denoted as CodeT5-large-ntp-py (770M), which was introduced in the paper: CodeRL: Mastering Code Generation through Pretrained Models and Deep Reinforcement Learning by Hung Le, Yue Wang, Akhilesh Deepak Gotmare, Silvio Savarese, and Steven C.H. Hoi.

Training data

CodeT5-large-ntp-py was pretrained on CodeSearchNet data in six programming languages (Ruby/JavaScript/Go/Python/Java/PHP) and on GCPY, the Python split of the GitHub Code dataset. See Section 4.1 of the CodeRL paper for more details.

Training procedure

CodeT5-large-ntp-py was first pretrained with the Masked Span Prediction (MSP) objective for 150 epochs on CodeSearchNet and 10 epochs on GCPY, followed by another 10 epochs on GCPY with the Next Token Prediction (NTP) objective. See Section 4.1 of the CodeRL paper for more details.
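
To make the two objectives concrete, the following purely illustrative sketch builds an MSP-style and an NTP-style training pair from a small code snippet. The whitespace-level masking, the sentinel-token names, and the span boundaries are assumptions for illustration only; the actual pretraining uses the CodeT5 tokenizer and the corruption settings described in the papers.

# Illustrative sketch only: schematic MSP vs. NTP training pairs.
code = "def add(a, b):\n    return a + b"
tokens = code.split()  # crude whitespace tokenization, for illustration

# Masked Span Prediction (MSP): a contiguous span is replaced by a sentinel
# token in the encoder input; the decoder target reproduces the masked span.
span_start, span_end = 1, 3
msp_input = tokens[:span_start] + ["<extra_id_0>"] + tokens[span_end:]
msp_target = ["<extra_id_0>"] + tokens[span_start:span_end] + ["<extra_id_1>"]

# Next Token Prediction (NTP): the encoder sees a prefix of the sequence and
# the decoder generates the remainder left to right.
pivot = len(tokens) // 2
ntp_input, ntp_target = tokens[:pivot], tokens[pivot:]

print("MSP input :", " ".join(msp_input))
print("MSP target:", " ".join(msp_target))
print("NTP input :", " ".join(ntp_input))
print("NTP target:", " ".join(ntp_target))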

Evaluation results

We evaluated this checkpoint on the APPS benchmark. See Table 5 of the CodeRL paper for more details.
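
As a rough illustration of how such an evaluation proceeds, the sketch below samples several candidate programs for an APPS-style problem. The prompt format, the problem text, and the sampling settings are assumptions, and the full CodeRL pipeline (unit-test execution and critic-based reranking) is not reproduced here.

# A minimal sketch of sampling candidate programs for an APPS-style problem.
# The prompt format and decoding settings below are assumptions, not taken
# from the CodeRL evaluation code.
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("Salesforce/codet5-large-ntp-py")
model = T5ForConditionalGeneration.from_pretrained("Salesforce/codet5-large-ntp-py")

problem = (
    "QUESTION:\n"
    "Read a list of integers from standard input and print the sum of the even ones.\n\n"
    "ANSWER:\n"
)
input_ids = tokenizer(problem, return_tensors="pt").input_ids

# Sample several candidates; pass@k is then computed by running each candidate
# against the benchmark's hidden unit tests (not shown here).
candidate_ids = model.generate(
    input_ids,
    do_sample=True,
    temperature=0.8,
    max_length=512,
    num_return_sequences=5,
)
for i, ids in enumerate(candidate_ids):
    print(f"--- candidate {i} ---")
    print(tokenizer.decode(ids, skip_special_tokens=True))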

How to use

This model can be loaded using the T5ForConditionalGeneration class:

from transformers import AutoTokenizer, T5ForConditionalGeneration
tokenizer = AutoTokenizer.from_pretrained("Salesforce/codet5-large-ntp-py")
model = T5ForConditionalGeneration.from_pretrained("Salesforce/codet5-large-ntp-py")
text = "def hello_world():"
input_ids = tokenizer(text, return_tensors="pt").input_ids

# simply generate a single sequence
generated_ids = model.generate(input_ids, max_length=128)
print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))
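
Continuing the example above, beam search can be used instead of the default greedy decoding for longer completions; the beam size and length below are illustrative choices, not settings from the model card.

# Beam-search variant of the example above; settings are illustrative only.
generated_ids = model.generate(
    input_ids,
    num_beams=5,
    max_length=128,
    early_stopping=True,
)
print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))
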
BibTeX entry and citation info
@inproceedings{CodeT52021,
  author    = {Yue Wang and Weishi Wang and Shafiq R. Joty and Steven C. H. Hoi},
  title     = {CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation},
  booktitle = {EMNLP},
  pages     = {8696--8708},
  publisher = {Association for Computational Linguistics},
  year      = {2021}
}

@article{CodeRL2022,
  author    = {Hung Le and Yue Wang and Akhilesh Deepak Gotmare and Silvio Savarese and Steven C. H. Hoi},
  title     = {CodeRL: Mastering Code Generation through Pretrained Models and Deep Reinforcement Learning},
  journal   = {arXiv preprint},
  volume    = {abs/2207.01780},
  year      = {2022}
}

More information about codet5-large-ntp-py

License: BSD-3-Clause (https://choosealicense.com/licenses/bsd-3-clause)

Model page: https://huggingface.co/Salesforce/codet5-large-ntp-py

Provider: Salesforce
