shibing624 / code-autocomplete-gpt2-base

huggingface.co
Total runs: 159
24-hour runs: 94
7-day runs: 32
30-day runs: -69
Model's Last Updated: March 19, 2023
text-generation

Introduction of code-autocomplete-gpt2-base

Model Details of code-autocomplete-gpt2-base

GPT2 for Code AutoComplete Model

code-autocomplete is a code completion plugin for Python.

It can automatically complete lines and blocks of code using GPT2.

Usage

Open source repo: code-autocomplete. It supports the GPT2 model; usage:

from autocomplete.gpt2_coder import GPT2Coder

m = GPT2Coder("shibing624/code-autocomplete-gpt2-base")
print(m.generate('import torch.nn as')[0])
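
For example, the same GPT2Coder object can be reused across several prompts (a minimal sketch using only the generate call shown above; the prompt strings are illustrative):

```python
from autocomplete.gpt2_coder import GPT2Coder

# Load the model once and reuse it for multiple prompts.
m = GPT2Coder("shibing624/code-autocomplete-gpt2-base")

prompts = [
    "import torch.nn as",   # illustrative prompts
    "def factorial(n):",
]
for prompt in prompts:
    completion = m.generate(prompt)[0]  # generate() returns a list of candidate completions
    print(completion)
    print("=" * 20)
```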

Also, you can use huggingface/transformers directly:

Please use the GPT2-related classes (GPT2Tokenizer, GPT2LMHeadModel) to load this model.

import os
import torch
from transformers import GPT2Tokenizer, GPT2LMHeadModel

os.environ["KMP_DUPLICATE_LIB_OK"] = "TRUE"
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

tokenizer = GPT2Tokenizer.from_pretrained("shibing624/code-autocomplete-gpt2-base")
model = GPT2LMHeadModel.from_pretrained("shibing624/code-autocomplete-gpt2-base")
model.to(device)
prompts = [
    """from torch import nn
    class LSTM(Module):
        def __init__(self, *,
                     n_tokens: int,
                     embedding_size: int,
                     hidden_size: int,
                     n_layers: int):""",
    """import numpy as np
    import torch
    import torch.nn as""",
    "import java.util.ArrayList",
    "def factorial(n):",
]
for prompt in prompts:
    input_ids = tokenizer.encode(prompt, add_special_tokens=False, return_tensors='pt').to(device)
    outputs = model.generate(input_ids=input_ids,
                             max_length=64 + len(prompt),
                             temperature=1.0,
                             top_k=50,
                             top_p=0.95,
                             repetition_penalty=1.0,
                             do_sample=True,
                             num_return_sequences=1,
                             length_penalty=2.0,
                             early_stopping=True)
    decoded = tokenizer.decode(outputs[0], skip_special_tokens=True)
    print(decoded)
    print("=" * 20)

output:

```shell
from torch import nn
class LSTM(Module):
    def __init__(self, *,
                 n_tokens: int,
                 embedding_size: int,
                 hidden_size: int,
                 n_layers: int):
        self.embedding_size = embedding_size

import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
```
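
The decoded text returned by tokenizer.decode includes the original prompt. As an illustrative helper (an assumption, not part of the model card or repo), the newly generated portion can be separated from the prompt and trimmed to its first non-empty line:

```python
def extract_completion(prompt: str, decoded: str) -> str:
    """Return only the text generated after the prompt (illustrative helper)."""
    # The decoded sequence starts with the prompt, so drop that prefix first.
    completion = decoded[len(prompt):] if decoded.startswith(prompt) else decoded
    # Keep the first non-empty line, which is typically the line completion of interest.
    for line in completion.splitlines():
        if line.strip():
            return line
    return completion.strip()

# Example with the prompt used above and a decoded string of the same shape.
print(extract_completion("import torch.nn as",
                         "import torch.nn as nn\nimport torch.nn.functional as F"))
```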


Model files:

code-autocomplete-gpt2-base
├── config.json
├── merges.txt
├── pytorch_model.bin
├── special_tokens_map.json
├── tokenizer_config.json
└── vocab.json
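
If these files are downloaded to a local directory, the same from_pretrained calls accept a local path instead of the hub ID (a minimal sketch; the directory name below is an assumption):

```python
from transformers import GPT2Tokenizer, GPT2LMHeadModel

# "./code-autocomplete-gpt2-base" is an assumed local directory containing the files listed above.
tokenizer = GPT2Tokenizer.from_pretrained("./code-autocomplete-gpt2-base")
model = GPT2LMHeadModel.from_pretrained("./code-autocomplete-gpt2-base")
```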


### Train data
#### pytorch_awesome projects source code

Download [code-autocomplete](https://github.com/shibing624/code-autocomplete), then run:
```shell
cd autocomplete
python create_dataset.py
```

If you want to train the code-autocomplete GPT2 model, refer to https://github.com/shibing624/code-autocomplete/blob/main/autocomplete/gpt2_coder.py
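
For reference, the sketch below shows one way to fine-tune GPT2 on such a dataset with huggingface/transformers. It is illustrative only, not the repo's gpt2_coder.py; the training file name, output directory, and hyperparameters are assumptions.

```python
from datasets import load_dataset
from transformers import (GPT2Tokenizer, GPT2LMHeadModel,
                          DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)

# Assumed training data: a plain-text file of Python source code named train.txt.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
model = GPT2LMHeadModel.from_pretrained("gpt2")

dataset = load_dataset("text", data_files={"train": "train.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# mlm=False selects causal language modeling, the objective GPT2 is trained with.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="gpt2-code-autocomplete",  # assumed output directory
    num_train_epochs=1,
    per_device_train_batch_size=4,
    save_steps=1000,
)

trainer = Trainer(model=model, args=args,
                  train_dataset=tokenized, data_collator=collator)
trainer.train()
trainer.save_model("gpt2-code-autocomplete")
```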

About GPT2

Test the whole generation capabilities here: https://transformer.huggingface.co/doc/gpt2-large

Pretrained model on English language using a causal language modeling (CLM) objective. It was introduced in this paper and first released at this page.
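
As a brief illustration of the CLM objective (a sketch, not from the model card): the model learns to predict each next token, so the loss can be computed by passing the input ids as the labels:

```python
import torch
from transformers import GPT2Tokenizer, GPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained("shibing624/code-autocomplete-gpt2-base")
model = GPT2LMHeadModel.from_pretrained("shibing624/code-autocomplete-gpt2-base")

# For causal language modeling the labels are the input ids themselves;
# the model shifts them internally so each position predicts the next token.
inputs = tokenizer("def factorial(n):", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs, labels=inputs["input_ids"])
print(outputs.loss)  # average next-token cross-entropy over the sequence
```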

Disclaimer: The team releasing GPT-2 also wrote a model card for their model. Content from this model card has been written by the Hugging Face team to complete the information they provided and give specific examples of bias.

Citation
@misc{code-autocomplete,
  author = {Xu Ming},
  title = {code-autocomplete: Code AutoComplete with GPT model},
  year = {2022},
  publisher = {GitHub},
  journal = {GitHub repository},
  url = {https://github.com/shibing624/code-autocomplete},
}

Runs of shibing624 code-autocomplete-gpt2-base on huggingface.co

Total runs: 159
24-hour runs: 94
3-day runs: 72
7-day runs: 32
30-day runs: -69

More Information About code-autocomplete-gpt2-base huggingface.co Model

For the code-autocomplete-gpt2-base license, visit:

https://choosealicense.com/licenses/apache-2.0

code-autocomplete-gpt2-base huggingface.co

code-autocomplete-gpt2-base is an AI model on huggingface.co that provides the code-autocomplete-gpt2-base model's capabilities, which can be used instantly with this shibing624 model. huggingface.co supports a free trial of the code-autocomplete-gpt2-base model and also provides paid use. The model can be called through an API from Node.js, Python, or plain HTTP.
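
As an illustrative sketch of calling the model over HTTP (the hosted Hugging Face Inference API endpoint and the placeholder access token below are assumptions):

```python
import requests

# Standard Hugging Face Inference API pattern (assumption: the model is served there).
API_URL = "https://api-inference.huggingface.co/models/shibing624/code-autocomplete-gpt2-base"
HEADERS = {"Authorization": "Bearer <YOUR_HF_TOKEN>"}  # placeholder token

# Send a text-generation request; the prompt is one of the examples above.
response = requests.post(API_URL, headers=HEADERS,
                         json={"inputs": "import torch.nn as"})
print(response.json())
```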

code-autocomplete-gpt2-base huggingface.co Url

https://huggingface.co/shibing624/code-autocomplete-gpt2-base

shibing624 code-autocomplete-gpt2-base online free

code-autocomplete-gpt2-base on huggingface.co is an online trial and API platform that integrates the model's capabilities, including API services, and provides a free online trial of code-autocomplete-gpt2-base. You can try code-autocomplete-gpt2-base online for free by clicking the link below.

shibing624 code-autocomplete-gpt2-base online free url in huggingface.co:

https://huggingface.co/shibing624/code-autocomplete-gpt2-base

code-autocomplete-gpt2-base install

code-autocomplete-gpt2-base is an open-source model available on GitHub, where any user can find and install it for free. At the same time, huggingface.co hosts the installed model, so users can try and debug code-autocomplete-gpt2-base directly on huggingface.co. Free API access is also supported.

code-autocomplete-gpt2-base install url in huggingface.co:

https://huggingface.co/shibing624/code-autocomplete-gpt2-base

Url of code-autocomplete-gpt2-base

code-autocomplete-gpt2-base huggingface.co Url

https://huggingface.co/shibing624/code-autocomplete-gpt2-base

Provider of code-autocomplete-gpt2-base huggingface.co

shibing624
ORGANIZATIONS

Other API from shibing624

huggingface.co

Total runs: 68
Run Growth: -16
Growth Rate: -22.86%
Updated: February 19, 2024