shibing624 / code-autocomplete-distilgpt2-python

Model page: https://huggingface.co/shibing624/code-autocomplete-distilgpt2-python
Last updated: February 19, 2024
Pipeline tag: text-generation


# GPT2 for Code AutoComplete Model

code-autocomplete is a code completion plugin for Python.

It can automatically complete lines and blocks of code with GPT2.

## Usage
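First install the package. A minimal setup (the PyPI package name is assumed to match the repo name):

```shell
pip install torch
pip install -U code-autocomplete
```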

Open source repo: [code-autocomplete](https://github.com/shibing624/code-autocomplete). It supports the GPT2 model; usage:

```python
from autocomplete.gpt2_coder import GPT2Coder

m = GPT2Coder("shibing624/code-autocomplete-distilgpt2-python")
# generate() returns a list of candidate completions; print the first one
print(m.generate('import torch.nn as')[0])
```

Alternatively, load the model directly with huggingface/transformers. Note: use the GPT2-related classes (GPT2Tokenizer, GPT2LMHeadModel) to load this model:

```python
import os
from transformers import GPT2Tokenizer, GPT2LMHeadModel

# Work around "duplicate OpenMP runtime" crashes on some platforms
os.environ["KMP_DUPLICATE_LIB_OK"] = "TRUE"

tokenizer = GPT2Tokenizer.from_pretrained("shibing624/code-autocomplete-distilgpt2-python")
model = GPT2LMHeadModel.from_pretrained("shibing624/code-autocomplete-distilgpt2-python")

prompts = [
    """from torch import nn
    class LSTM(Module):
        def __init__(self, *,
                     n_tokens: int,
                     embedding_size: int,
                     hidden_size: int,
                     n_layers: int):""",
    """import numpy as np
    import torch
    import torch.nn as""",
    "import java.util.ArrayList",
    "def factorial(n):",
]
for prompt in prompts:
    input_ids = tokenizer.encode(prompt, add_special_tokens=False, return_tensors='pt')
    outputs = model.generate(input_ids=input_ids,
                             max_length=64 + len(prompt),  # note: len(prompt) counts characters, not tokens
                             temperature=1.0,
                             top_k=50,
                             top_p=0.95,
                             repetition_penalty=1.0,
                             do_sample=True,
                             num_return_sequences=1,
                             length_penalty=2.0,
                             early_stopping=True)
    decoded = tokenizer.decode(outputs[0], skip_special_tokens=True)
    print(decoded)
    print("=" * 20)
```

output:

```shell
from torch import nn
class LSTM(Module):
    def __init__(self, *,
                 n_tokens: int,
                 embedding_size: int,
                 hidden_size: int,
                 n_layers: int):
        self.embedding_size = embedding_size
====================
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
====================
```
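For plugin-style completion you typically want only the next line or block, not the raw 64-token sample. A minimal trimming helper, shown as an illustrative sketch (the function is hypothetical, not part of the library):

```python
def first_completion_line(prompt: str, generated: str) -> str:
    """Return only the first new line the model appended after the prompt."""
    continuation = generated[len(prompt):]
    # Keep text up to the first newline and drop trailing whitespace
    return continuation.split("\n", 1)[0].rstrip()
```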


Model files:

```shell
code-autocomplete-distilgpt2-python
├── config.json
├── merges.txt
├── pytorch_model.bin
├── special_tokens_map.json
├── tokenizer_config.json
└── vocab.json
```
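To fetch all of these files locally (e.g., for offline use), the standard huggingface_hub tooling works; nothing here is specific to this model:

```python
from huggingface_hub import snapshot_download

# Downloads config.json, pytorch_model.bin, tokenizer files, etc. into the local cache
local_dir = snapshot_download("shibing624/code-autocomplete-distilgpt2-python")
print(local_dir)
```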


### Train data
#### pytorch_awesome projects source code

Download [code-autocomplete](https://github.com/shibing624/code-autocomplete), then build the dataset:
```shell
cd autocomplete
python create_dataset.py
```

If you want to train the code-autocomplete GPT2 model, refer to https://github.com/shibing624/code-autocomplete/blob/main/autocomplete/gpt2_coder.py
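For orientation, a causal-LM fine-tune with the plain transformers Trainer might look like the sketch below; train.txt and the hyperparameters are illustrative assumptions, and the repo's gpt2_coder.py remains the authoritative training script:

```python
from transformers import (DataCollatorForLanguageModeling, GPT2LMHeadModel,
                          GPT2Tokenizer, TextDataset, Trainer, TrainingArguments)

tokenizer = GPT2Tokenizer.from_pretrained("distilgpt2")
model = GPT2LMHeadModel.from_pretrained("distilgpt2")

# train.txt: concatenated Python source, e.g. as produced by create_dataset.py
# (file name and format are assumptions for this sketch)
train_dataset = TextDataset(tokenizer=tokenizer, file_path="train.txt", block_size=128)
# mlm=False selects causal language modeling (next-token prediction)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="outputs",
                           num_train_epochs=3,
                           per_device_train_batch_size=8),
    train_dataset=train_dataset,
    data_collator=collator,
)
trainer.train()
trainer.save_model("outputs")
```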

## About GPT2

Test the whole generation capabilities here: https://transformer.huggingface.co/doc/gpt2-large

GPT-2 is a model pretrained on English text using a causal language modeling (CLM) objective. It was introduced in this paper and first released at this page.
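In transformers terms, the CLM objective is plain next-token cross-entropy; passing the inputs as labels makes the model compute it (labels are shifted internally):

```python
# Reusing the tokenizer and model loaded in the Usage section above
inputs = tokenizer("def add(a, b):", return_tensors="pt")
outputs = model(input_ids=inputs.input_ids, labels=inputs.input_ids)
print(outputs.loss)  # average next-token cross-entropy over the sequence
```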

Disclaimer: The team releasing GPT-2 also wrote a model card for their model. Content from this model card has been written by the Hugging Face team to complete the information they provided and give specific examples of bias.

## Citation

```bibtex
@misc{code-autocomplete,
  author = {Xu Ming},
  title = {code-autocomplete: Code AutoComplete with GPT model},
  year = {2022},
  publisher = {GitHub},
  journal = {GitHub repository},
  url = {https://github.com/shibing624/code-autocomplete},
}
```


## License

Apache-2.0: https://choosealicense.com/licenses/apache-2.0
