Pretraining data
CodeT5-large was pretrained on the CodeSearchNet data in six programming languages (Ruby, JavaScript, Go, Python, Java, and PHP). See Section 4.1 of the paper for more details.
Training procedure
CodeT5-large was pretrained using the masked span prediction objective for 150 epochs. See Section 4.1 of the paper for more details.
Evaluation results
We validate the effectiveness of this checkpoint, pretrained with simplified strategies, on the CodeXGLUE benchmark. See Appendix A.1 of the paper for more details.
How to use
This model can be easily loaded using the T5ForConditionalGeneration functionality:
from transformers import AutoTokenizer, T5ForConditionalGeneration
tokenizer = AutoTokenizer.from_pretrained("Salesforce/codet5-large")
model = T5ForConditionalGeneration.from_pretrained("Salesforce/codet5-large")
text = "def greet(user): print(f'hello <extra_id_0>!')"
input_ids = tokenizer(text, return_tensors="pt").input_ids
# simply generate a single sequence
generated_ids = model.generate(input_ids, max_length=8)
print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))
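The decoded output pairs each sentinel with the span predicted for it (e.g. something like "<extra_id_0> {user}"). As a hedged illustration, a small helper (assumed names, not part of the transformers library) can splice such an output back into the masked source; the model prediction shown is a plausible example, not a verified output of this checkpoint:

```python
import re

def fill_sentinels(source, generated):
    """Replace each <extra_id_i> in the source with the span the model
    generated for it; `generated` looks like "<extra_id_0> span0 <extra_id_1> span1"."""
    # Split the generated text into alternating (sentinel, span) pieces.
    parts = re.split(r"(<extra_id_\d+>)", generated)
    fills = {}
    for i in range(1, len(parts) - 1, 2):
        fills[parts[i]] = parts[i + 1].strip()
    # Substitute each sentinel in the source with its predicted span.
    return re.sub(r"<extra_id_\d+>", lambda m: fills.get(m.group(0), ""), source)

source = "def greet(user): print(f'hello <extra_id_0>!')"
print(fill_sentinels(source, "<extra_id_0> {user}"))
# def greet(user): print(f'hello {user}!')
```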
BibTeX entry and citation info
@inproceedings{CodeT52021,
author = {Yue Wang and Weishi Wang and Shafiq R. Joty and Steven C. H. Hoi},
title = {CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation},
booktitle = {EMNLP},
pages = {8696--8708},
publisher = {Association for Computational Linguistics},
year = {2021}
}
@article{CodeRL2022,
author = {Hung Le and Yue Wang and Akhilesh Deepak Gotmare and Silvio Savarese and Steven C. H. Hoi},
title = {CodeRL: Mastering Code Generation through Pretrained Models and Deep Reinforcement Learning},
journal = {arXiv preprint},
volume = {abs/2207.01780},
year = {2022}
}