
This model was built by continual pretraining of ahxt/LiteLlama-460M-1T on 3.9B tokens of Japanese and English data.

See here for details.

The model is named after the Yonaguni horse (ヨナグニウマ), a small-bodied native Japanese horse breed.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

model = AutoModelForCausalLM.from_pretrained('Kendamarron/Yonaguni-460M-v0.1')
tokenizer = AutoTokenizer.from_pretrained('Kendamarron/Yonaguni-460M-v0.1')

pipe = pipeline('text-generation', model=model, tokenizer=tokenizer)

prompt = "大規模言語モデルとは、"

# do_sample=True is required for temperature and top_p to take effect
print(pipe(prompt, max_length=128, do_sample=True, repetition_penalty=1.1, temperature=0.7, top_p=0.95)[0]['generated_text'])
```
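
If you prefer to call the model directly instead of going through the pipeline, the same generation settings can be passed to `model.generate`. A minimal sketch, reusing the model, tokenizer, and prompt loaded above:

```python
# Minimal sketch: direct generation without the pipeline wrapper.
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=128,
    do_sample=True,
    repetition_penalty=1.1,
    temperature=0.7,
    top_p=0.95,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```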