---
library_name: transformers
language:
- ko
license: gemma
tags:
- gemma
- pytorch
- instruct
- finetune
- translation
widget:
  - messages:
      - role: user
        content: "Translate into English:Hamsters don't eat cats."

base_model: google/gemma-1.1-2b-it
datasets:
- traintogpb/aihub-flores-koen-integrated-sparta-30k
pipeline_tag: text-generation
---


# Gemma 2B Translation v0.131

- Eval  Loss: `0.99568`
- Train Loss: `0.88993`
- lr: `6e-05`
- optimizer: adamw
- lr_scheduler_type: cosine
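
The hyperparameters above roughly map onto a `transformers` training configuration along the following lines. This is a hypothetical sketch only: batch size, epoch count, and precision are not reported in this card, so the values used for them below are placeholders rather than the settings actually used.

```python
# Hypothetical mapping of the reported hyperparameters to TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="gemma-2b-translation-v0.131",  # hypothetical output path
    learning_rate=6e-5,              # lr reported above
    optim="adamw_torch",             # optimizer: adamw
    lr_scheduler_type="cosine",      # lr_scheduler_type: cosine
    per_device_train_batch_size=8,   # assumption: not reported in this card
    num_train_epochs=1,              # assumption: not reported in this card
    bf16=True,                       # assumption: typical for Gemma fine-tuning
)
```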

## Prompt Template

```
<bos><start_of_turn>user
Translate into Korean:Hamsters don't eat cats.<end_of_turn>
<start_of_turn>model
ํ–„์Šคํ„ฐ๋Š” ๊ณ ์–‘์ด๋ฅผ ๋จน์ง€ ์•Š์Šต๋‹ˆ๋‹ค.<eos>
```

```
<bos><start_of_turn>user
Translate into English:ํ–„์Šคํ„ฐ๋Š” ๊ณ ์–‘์ด๋ฅผ ๋จน์ง€ ์•Š์Šต๋‹ˆ๋‹ค.<end_of_turn>
<start_of_turn>model
Hamsters do not eat cats.<eos>
```
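
This format matches the base model's Gemma chat template, so prompts can be built with the tokenizer's `apply_chat_template`. Below is a minimal inference sketch, assuming the standard `transformers` generation API; the repository id is a placeholder, and `max_new_tokens` is illustrative rather than a value taken from this card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "..."  # placeholder: replace with this model's Hub repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Build the prompt shown above via the tokenizer's chat template.
messages = [
    {"role": "user", "content": "Translate into English:햄스터는 고양이를 먹지 않습니다."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=128, do_sample=False)
# Decode only the newly generated tokens (the translation).
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```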

## Model Description

- **Developed by:** `lemon-mint`
- **Model type:** Gemma
- **Language(s) (NLP):** Korean, English
- **License:** [gemma-terms-of-use](https://ai.google.dev/gemma/terms)
- **Finetuned from model:** [google/gemma-1.1-2b-it](https://huggingface.co/google/gemma-1.1-2b-it)