GraphCodeBERT is a graph-based pre-trained Transformer model for programming languages that considers data-flow information in addition to the code token sequence. The model has 12 layers, 768-dimensional hidden states, and 12 attention heads, with a maximum sequence length of 512. It was trained on the CodeSearchNet dataset, which includes 2.3M functions paired with documentation across six programming languages.
More details can be found in the paper by Guo et al.
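For reference, here is a minimal sketch of loading the checkpoint with the Hugging Face transformers library and encoding a code snippet. It feeds only the code sequence; the data-flow input used during GraphCodeBERT pre-training is omitted, and the example code string is an arbitrary choice.

```python
# Minimal sketch: encode a code snippet with microsoft/graphcodebert-base.
# Only the token sequence is used here; data-flow edges are not constructed.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("microsoft/graphcodebert-base")
model = AutoModel.from_pretrained("microsoft/graphcodebert-base")

code = "def max(a, b): return a if a > b else b"
inputs = tokenizer(code, return_tensors="pt", truncation=True, max_length=512)
outputs = model(**inputs)

# One 768-dimensional hidden state per token: (batch_size, sequence_length, 768)
print(outputs.last_hidden_state.shape)
```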
Disclaimer: The team releasing GraphCodeBERT did not write a model card for this model, so this model card has been written by Hugging Face community members.
Runs of microsoft/graphcodebert-base on huggingface.co:
Total runs: 105.0K
24-hour runs: 442
3-day runs: 17.7K
7-day runs: 10.0K
30-day runs: 55.2K
More information about the graphcodebert-base model on huggingface.co
graphcodebert-base is hosted on huggingface.co, where the model can be tried online for free and called through an API (for example from Node.js, Python, or plain HTTP). Paid usage of the model is also available.
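As a hedged illustration of the HTTP/Python API access mentioned above, the sketch below assumes the model is exposed through the Hugging Face Inference API feature-extraction endpoint; the endpoint task, the response shape, and the HF_API_TOKEN placeholder are assumptions, not details from this page.

```python
# Sketch: call microsoft/graphcodebert-base over HTTP, assuming it is served
# by the Hugging Face Inference API (token placeholder must be replaced).
import requests

API_URL = "https://api-inference.huggingface.co/models/microsoft/graphcodebert-base"
headers = {"Authorization": "Bearer HF_API_TOKEN"}  # your own access token

payload = {"inputs": "def add(a, b): return a + b"}
response = requests.post(API_URL, headers=headers, json=payload)
response.raise_for_status()

# For a feature-extraction deployment, the JSON body is a nested list of
# per-token embeddings; the exact nesting depends on the deployed pipeline.
embeddings = response.json()
print(type(embeddings))
```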
Free online trial URL for microsoft/graphcodebert-base on huggingface.co:
graphcodebert-base is also open source: the code is available on GitHub, and any user can install and run the model locally for free. Alternatively, the hosted version on huggingface.co can be used directly for debugging and trial, and the API can likewise be used free of charge. A download sketch follows below.
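For local installation and offline use, one option is to fetch the model files with the huggingface_hub client, as sketched below; the choice of client is an assumption rather than an instruction from this page.

```python
# Sketch: download the microsoft/graphcodebert-base files for local use.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(repo_id="microsoft/graphcodebert-base")
print(f"Model files downloaded to: {local_dir}")

# The downloaded checkpoint can then be loaded offline, e.g. with
# transformers.AutoModel.from_pretrained(local_dir).
```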