---
license: apache-2.0
language:
- zh
- en
---
# Chinese-CodeLlama-7B-SFT-V2

We added more than 7k Python code instructions and performed SFT on top of our [Chinese-CodeLlama-7B-SFT-V1](https://huggingface.co/frankminors123/Chinese-CodeLlama-7B-SFT-V1). Drawing on the [Code Llama](https://ai.meta.com/research/publications/code-llama-open-foundation-models-for-code/) work, we increased the base period of the rotary positional embeddings (RoPE) from 10,000 to 1,000,000.
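A minimal usage sketch with 🤗 Transformers is shown below. The repo id and the `config.rope_theta` field (how recent Transformers versions expose the RoPE base period for LLaMA-family models) are assumptions based on the description above, not part of the original card.

```python
# Minimal sketch, assuming: transformers >= 4.33 (rope_theta support), accelerate installed,
# and that this model is hosted under the repo id below.
import torch
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_id = "frankminors123/Chinese-CodeLlama-7B-SFT-V2"  # assumed repo id for this card

# The RoPE base period lives in the model config; per the description it should
# read 1,000,000 instead of the LLaMA default of 10,000.
config = AutoConfig.from_pretrained(model_id)
print(config.rope_theta)  # expected: 1000000.0

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    config=config,
    torch_dtype=torch.float16,
    device_map="auto",
)

prompt = "Write a Python function that returns the n-th Fibonacci number."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```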