Datascience-Lab / GPT2-small

huggingface.co
Total runs: 222
24-hour runs: 4
7-day runs: 207
30-day runs: 206
Model last updated: August 01, 2023
text-generation

Introduction of GPT2-small

Model Details of GPT2-small

KoGPT2-small

Model: GPT2
Batch Size: 64
Tokenizer: BPE
Vocab Size: 30,000
Max Length: 1024
Parameter Size: 108M
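
These figures can be sanity-checked locally by loading the published configuration and counting the parameters; the attribute names below assume the standard Hugging Face GPT2Config:

from transformers import AutoConfig, GPT2LMHeadModel

# Load only the configuration from the Hub (no weights required).
config = AutoConfig.from_pretrained('Datascience-Lab/GPT2-small')
print(config.vocab_size)    # expected: 30000
print(config.n_positions)   # expected: 1024 (max sequence length)

# Loading the weights lets you count parameters (~108M).
model = GPT2LMHeadModel.from_pretrained('Datascience-Lab/GPT2-small')
print(sum(p.numel() for p in model.parameters()) / 1e6, "M parameters")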

Dataset

  • AIhub web-data-based Korean corpus (웹데이터 기반 한국어 말뭉치 데이터) (4.8M)
  • KoWiki dump 230701 (1.4M)
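
The corpora themselves are not distributed with the model. Below is a minimal sketch of how raw Korean text of this kind could be tokenized and packed into 1024-token blocks for GPT-2 pretraining; the file names and the packing scheme are illustrative assumptions, not the authors' actual preprocessing pipeline:

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('Datascience-Lab/GPT2-small')
BLOCK_SIZE = 1024  # matches the model's max length listed above

def pack_corpus(paths, block_size=BLOCK_SIZE):
    """Tokenize raw text files and pack them into fixed-length token blocks."""
    buffer = []
    for path in paths:  # e.g. extracted AIhub / KoWiki text files (hypothetical paths)
        with open(path, encoding='utf-8') as f:
            for line in f:
                buffer.extend(tokenizer.encode(line))
                while len(buffer) >= block_size:
                    yield buffer[:block_size]
                    buffer = buffer[block_size:]

# blocks = list(pack_corpus(['aihub_web.txt', 'kowiki_230701.txt']))  # hypothetical filenames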

Inference Example

from transformers import AutoTokenizer, GPT2LMHeadModel

text = "출근이 힘들면"  # "If commuting to work is hard"

tokenizer = AutoTokenizer.from_pretrained('Datascience-Lab/GPT2-small')
model = GPT2LMHeadModel.from_pretrained('Datascience-Lab/GPT2-small')

# Tokenize the prompt without adding special tokens.
inputs = tokenizer(text, return_tensors='pt', add_special_tokens=False)

outputs = model.generate(inputs['input_ids'],
                         max_length=128,
                         repetition_penalty=2.0,
                         pad_token_id=tokenizer.pad_token_id,
                         eos_token_id=tokenizer.eos_token_id,
                         bos_token_id=tokenizer.bos_token_id,
                         use_cache=True,
                         do_sample=True,   # temperature only takes effect when sampling
                         temperature=0.5)
outputs = tokenizer.decode(outputs[0], skip_special_tokens=True)

# Example output: '출근이 힘들면 출근을 하지 않는 것이 좋다. 하지만 출퇴근 시간을 늦추는 것은 오히려 건강에 좋지 않다. 특히나 장시간의 업무로 인해 피로가 쌓이고 면역력이 떨어지면, 피로감이 심해져서 잠들기 어려운 경우가 많다. 이런 경우라면 평소보다 더 많은 양으로 과식을 하거나 무리한 다이어트를 할 수 있다. 따라서 식단 조절과 함께 영양 보충에 신경 써야 한다. 또한 과도한 음식이 체중 감량에 도움을 주므로 적절한 운동량을 유지하는 것도 중요하다.'
# Rough translation: "If commuting is hard, it is better not to commute at all. But delaying the commute is actually bad for your health. Especially when fatigue builds up from long working hours and immunity drops, it is often hard to fall asleep. In that case you may overeat more than usual or go on an excessive diet, so pay attention to nutrition as well as diet control. Also, since excessive food helps with weight loss, maintaining an appropriate amount of exercise is important."
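
The same model can also be driven through the higher-level text-generation pipeline; the sampling settings below simply mirror the snippet above and are illustrative rather than recommended values:

from transformers import pipeline

# Wraps tokenizer and model loading plus generation in one call.
generator = pipeline('text-generation', model='Datascience-Lab/GPT2-small')

result = generator("출근이 힘들면",           # "If commuting to work is hard"
                   max_length=128,
                   repetition_penalty=2.0,
                   do_sample=True,            # required for temperature to take effect
                   temperature=0.5)
print(result[0]['generated_text'])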

Runs of Datascience-Lab GPT2-small on huggingface.co

  • Total runs: 222
  • 24-hour runs: 4
  • 3-day runs: 27
  • 7-day runs: 207
  • 30-day runs: 206

More Information About GPT2-small huggingface.co Model

GPT2-small is released under the Apache 2.0 license; see:

https://choosealicense.com/licenses/apache-2.0

GPT2-small huggingface.co

GPT2-small is an AI model hosted on huggingface.co that provides GPT2-small's text-generation capability and can be used instantly through this Datascience-Lab GPT2-small model. huggingface.co supports a free trial of the GPT2-small model and also provides paid use of GPT2-small. The model can be called through an API from Node.js, Python, or plain HTTP.
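
As a sketch of the plain-HTTP route, the model can be queried through Hugging Face's hosted Inference API; the endpoint pattern and token header below are standard Inference API conventions, and the availability of a free serverless endpoint for this particular model is an assumption:

import requests

API_URL = "https://api-inference.huggingface.co/models/Datascience-Lab/GPT2-small"
headers = {"Authorization": "Bearer YOUR_HF_TOKEN"}  # placeholder access token

# Send the prompt; the API returns the generated sequences as JSON.
response = requests.post(API_URL, headers=headers,
                         json={"inputs": "출근이 힘들면",
                               "parameters": {"max_length": 128, "temperature": 0.5}})
print(response.json())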

Datascience-Lab GPT2-small online free

huggingface.co is an online trial and API platform that integrates GPT2-small's capabilities, including API services, and provides a free online trial of GPT2-small. You can try GPT2-small online for free via the link below.

Datascience-Lab GPT2-small online free url in huggingface.co:

https://huggingface.co/Datascience-Lab/GPT2-small

GPT2-small install

GPT2-small is an open-source model that any user can install free of charge from its repository page. At the same time, huggingface.co hosts the model, so users can try GPT2-small directly on huggingface.co for debugging and experimentation. Calling the installed model through the API is also supported for free.
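
A typical local setup is sketched below: install the standard libraries with pip and download the model files with huggingface_hub; version pins are omitted and left to the user:

# pip install transformers torch huggingface_hub

from huggingface_hub import snapshot_download
from transformers import GPT2LMHeadModel

# Download all model files (config, tokenizer, weights) into the local cache
# and return the local directory path.
local_dir = snapshot_download("Datascience-Lab/GPT2-small")
print("Model files downloaded to:", local_dir)

# Load the model from the local copy.
model = GPT2LMHeadModel.from_pretrained(local_dir)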

GPT2-small install url in huggingface.co:

https://huggingface.co/Datascience-Lab/GPT2-small

URL of GPT2-small

https://huggingface.co/Datascience-Lab/GPT2-small

Provider of GPT2-small on huggingface.co

Datascience-Lab (organization)