daekeun-ml committed
Commit 6e563cc • 1 Parent(s): 5791b0d

Update README.md

Files changed (1)
  1. README.md +28 -1
README.md CHANGED
 
@@ -20,7 +20,7 @@ pipeline_tag: text-generation
  ## Model Details
  This model was trained with the unsloth toolkit on top of Microsoft's Phi-3 model, with some Korean instruction data added to enhance its Korean generation performance.
 
- Since I am not a working developer but an ML Technical Specialist helping customers with quick PoCs/prototypes, and the Azure GPU resources available to me were limited, I trained on only 40,000 samples on a single A100 GPU () for PoC purposes. Because I have not extended the tokenizer, generating Korean text requires far more tokens than the equivalent English text.
+ Since I am not a working developer but an ML Technical Specialist helping customers with quick PoCs/prototypes, and the Azure GPU resources available to me were limited, I trained on only 40,000 samples on a single Azure Standard_NC24ads_A100_v4 VM for PoC purposes. Because I have not extended the tokenizer, generating Korean text requires far more tokens than the equivalent English text.
 
  ### Dataset
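
To make the tokenizer remark in the hunk above concrete: Phi-3 reuses a roughly 32k-entry Llama-style vocabulary that covers Hangul sparsely, so Korean text largely falls back to byte-level pieces. Below is a minimal sketch of the comparison, assuming the microsoft/Phi-3-mini-4k-instruct tokenizer and illustrative sentences; neither is part of this commit.

```python
# Rough comparison of token usage for English vs. Korean with the base
# Phi-3 tokenizer (no Korean-specific tokenizer extension).
# The tokenizer id and both sentences are assumptions for illustration.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-4k-instruct")

english = "Machine learning is a field of artificial intelligence."
korean = "머신러닝은 인공지능의 한 분야입니다."  # roughly the same sentence in Korean

en_tokens = tokenizer.encode(english, add_special_tokens=False)
ko_tokens = tokenizer.encode(korean, add_special_tokens=False)

# Without a Korean tokenizer extension, the Korean sentence is typically
# split into many more (often byte-level) tokens than the English one.
print(f"English: {len(en_tokens)} tokens, Korean: {len(ko_tokens)} tokens")
```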
 
 
@@ -32,6 +32,8 @@ The dataset used for training is as follows. To prevent catastrophic forgetting,
 
 
  ## How to Get Started with the Model
+
+ ### Code snippets
  ```python
  ### Load model
  import torch
 
@@ -67,6 +69,7 @@ params = {
  ### Inference
  FastLanguageModel.for_inference(model) # Enable native 2x faster inference
 
+ # 1st example
  messages = [
      {"from": "human", "value": "Continue the fibonnaci sequence in Korean: 1, 1, 2, 3, 5, 8,"},
      {"from": "assistant", "value": "피보나치 수열의 다음 숫자는 13, 21, 34, 55, 89 등입니다. 각 숫자는 앞의 두 숫자의 합입니다."},
 
@@ -82,6 +85,7 @@ inputs = tokenizer.apply_chat_template(
  text_streamer = TextStreamer(tokenizer)
  _ = model.generate(input_ids = inputs, streamer = text_streamer, **params)
 
+ # 2nd example
  messages = [
      {"from": "human", "value": "What is Machine Learning in Korean?"},
      {"from": "assistant", "value": "인공지능의 한 분야로 방대한 데이터를 분석해 향후 패턴을 예측하는 기법입니다."},
 
@@ -99,6 +103,29 @@ text_streamer = TextStreamer(tokenizer)
  _ = model.generate(input_ids = inputs, streamer = text_streamer, **params)
  ```
 
+ ### Inference results
+ ```
+ # 1st example
+ <s><|user|> Continue the fibonnaci sequence in Korean: 1, 1, 2, 3, 5, 8,<|end|><|assistant|> 피보나치 수열의 다음 숫자는 13, 21, 34, 55, 89 등입니다. 각 숫자는 앞의 두 숫자의 합입니다.<|end|><|user|> Compute 2x+3=12 in Korean<|end|><|assistant|> 방정식 2x + 3 = 12에서 x를 풀려면 다음 단계를 따르십시오.
+
+ 1. 방정식의 양쪽에서 3을 빼서 방정식의 한쪽에 대해 x를 분리합니다.
+
+ 2x + 3 - 3 = 12 - 3
+
+ 2x = 9
+
+ 2. 이제 방정식의 양쪽을 2로 나누어 x의 값을 구합니다.
+
+ 2배 / 2 = 9 / 2
+
+ x = 4.5
+
+ 따라서 방정식 2x + 3 = 12에 대한 해는 x = 4.5입니다.<|end|>
+
+ # 2nd example
+ <s><|user|> What is Machine Learning in Korean?<|end|><|assistant|> 인공지능의 한 분야로 방대한 데이터를 분석해 향후 패턴을 예측하는 기법입니다.<|end|><|user|> What is Deep Learning in Korean?<|end|><|assistant|> 복잡한 데이터 세트를 분석하고 복잡한 패턴을 인식하고 학습하는 데 사용되는 딥러닝은 많은 레이어로 구성된 신경망의 하위 집합입니다. 이 기술은 이미지 인식, 자연어 처리 및 자율 운전과 같은 다양한 응용 분야에서 큰 발전을 이뤘습니다.<|end|>
+ ```
+
  ### References
  - Base model: [microsoft/phi-2](https://huggingface.co/microsoft/phi-2)
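
The hunks above quote only fragments of the "How to Get Started" snippet, so here is a minimal sketch, under stated assumptions, of how those fragments plausibly fit together with unsloth. The repo id placeholder, max_seq_length, load_in_4bit, chat-template mapping, and the params values are assumptions; only FastLanguageModel.for_inference, tokenizer.apply_chat_template, TextStreamer, and model.generate appear in the diff itself.

```python
# Minimal end-to-end sketch (assumed values marked as placeholders).
from unsloth import FastLanguageModel
from unsloth.chat_templates import get_chat_template
from transformers import TextStreamer

### Load model
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="daekeun-ml/REPLACE_WITH_THIS_REPO_ID",  # placeholder for this model's Hub id
    max_seq_length=4096,   # assumption
    dtype=None,            # auto-select float16/bfloat16
    load_in_4bit=True,     # assumption
)

# The README's messages use ShareGPT-style keys ("from"/"value", "human"),
# so map them onto the phi-3 chat template (assumed mapping).
tokenizer = get_chat_template(
    tokenizer,
    chat_template="phi-3",
    mapping={"role": "from", "content": "value", "user": "human", "assistant": "gpt"},
)

# Hypothetical generation parameters; the actual params dict is elided in the diff.
params = {"max_new_tokens": 256, "use_cache": True}

### Inference
FastLanguageModel.for_inference(model)  # Enable native 2x faster inference

messages = [
    {"from": "human", "value": "What is Machine Learning in Korean?"},
]
inputs = tokenizer.apply_chat_template(
    messages,
    tokenize=True,
    add_generation_prompt=True,
    return_tensors="pt",
).to("cuda")

text_streamer = TextStreamer(tokenizer)
_ = model.generate(input_ids=inputs, streamer=text_streamer, **params)
```

The ShareGPT-style mapping is assumed because the messages shown in the diff use "from"/"value"/"human" keys rather than the "role"/"content" convention that apply_chat_template expects by default.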