葉佐俊 (win10)
AI & ML interests: None yet
Organizations
win10's activity
Feels like this could be used with phi3.5?
5
#1 opened 4 days ago by win10
How did you convert it?
8
#1 opened 7 days ago by win10
Request: Multi-language dataset
#46 opened 11 days ago by win10
We are a scam team!!!!
#262 opened 11 days ago by win10
We are a scam team!!!!
#263 opened 11 days ago by win10
Trying development based on joy-caption
#1 opened 19 days ago by win10
Request TheDrummer/Star-Command-R-32B-v1
1
#1 opened 21 days ago by win10
Hi, can you try using fp16 for the embed and output weights?
#1 opened 28 days ago by win10
Hello! May I ask if you could publish a tutorial?
3
#1 opened about 1 month ago by win10
Request model: mistral-nemo
#2 opened about 1 month ago by win10
How many tokens was it trained on?
4
#1 opened 3 months ago by win10
Will you release the training code?
2
#12 opened 3 months ago by win10
Will the pre-training code be open-sourced?
#19 opened 3 months ago by win10
Will it be available in multiple languages?
1
#1 opened 4 months ago by win10
This is my merged 13B base model (7B to 13B)
2
#2 opened 4 months ago by win10
Will you merge llama3 8b and Mistral-7B-v0.3 into a base model, use the llama3 tokenizer, and train?
2
#1 opened 4 months ago by win10
May I ask whether there will be an int4 version?
#2 opened 4 months ago by win10
Can you add an fp8 or int4 quantization loader?
1
#3 opened 4 months ago by win10
What do you think is better out of the box?
#1 opened 5 months ago by win10
taide-meta-it-16b merged model
#10 opened 5 months ago by win10
Great Model
21
#1 opened 5 months ago by spike4379
mixtral format?
5
#1 opened 5 months ago by KnutJaegersberg
Suggestion: officially publish it on ollama
2
#1 opened 5 months ago by win10
Hi! Friends in the open source community, will you open source?
4
#1 opened 7 months ago by win10
Hi, will there be an instruction version of the coding model in the future?
#2 opened 7 months ago by win10
This is a great model, but there is a problem
1
#2 opened 8 months ago by win10
Will there be a model pre-trained from scratch later?
2
#1 opened 8 months ago by win10
Can you please fine-tune the RWKV-5 7B?
1
#2 opened 8 months ago by win10
Will there be a version with traditional Chinese in the future?
#5 opened 8 months ago by win10
Can I ask you to try training TinyDolphin-2.8-1.1b or create multilingual data?
1
#9 opened 9 months ago by win10
A more complex script generated by gpt4-turbo
1
#8 opened 9 months ago by win10
The number of samples is 5,000,000, from mathrandom2.py.
1
#6 opened 9 months ago by win10
A more complex script generated by gpt4-turbo
1
#3 opened 9 months ago by win10
A more complex script generated by gpt4-turbo
#4 opened 9 months ago by win10
A more complex script generated by gpt4-turbo
#2 opened 9 months ago by win10
How did you merge the models?
#2 opened 9 months ago by win10
Can you fine-tune tinyllama1.1b-3t?
3
#1 opened 9 months ago by win10
You should try training a model with 2B parameters and context length 32000.
1
#3 opened 9 months ago by win10
Could you try training an open-source chip-generation model?
#2 opened 9 months ago by win10
Will there be new models?
6
#1 opened 10 months ago by win10
Will there be a Chinese version of the model later?
#2 opened 10 months ago by win10
How did you merge the models?
5
#1 opened 10 months ago by win10
Could you please make a 6x1B MoE model?
1
#1 opened 10 months ago by win10
Can you make a model that runs without quantization on a GPU with only 8 GB of VRAM?
1
#2 opened 10 months ago by win10
What dataset did you use?
#1 opened 10 months ago by win10
Are there any plans for new Fine-tuning or will it be released only after the 3 trillion tokens version?
#3 opened 10 months ago by win10
Nice work!
7
#3 opened 10 months ago by BrainSlugs83
Request for Open Sourcing a Chat-Chip Model for Chip Design
#10 opened 11 months ago by win10
Hello, I would like to ask how you merged the 20B model?
12
#1 opened 11 months ago by win10
How to convert to gguf format?
#5 opened 11 months ago by win10
Can you do the same thing with Yi-6b-200k?
3
#6 opened 11 months ago by win10
How do I convert the llama2.c model into Hugging Face model format?
2
#4 opened 12 months ago by win10
Could you try training the model using open-source code?
2
#2 opened about 1 year ago by win10
Hello, could you package it as a zip file?
1
#2 opened about 1 year ago by win10
Are there plans for incremental training of a molecule-generation or creative model?
5
#1 opened about 1 year ago by win10
New activity in xiaol/RWKV-toolformer-translation-japanese-chinese-english-7B-World-128k, about 1 year ago
Could you try enhancing RWKV's programming ability?
1
#1 opened about 1 year ago by win10
Hello, could you try training the model to generate and analyze chemical molecules while preserving its original abilities?
7
#1 opened about 1 year ago by win10
Can this dataset support effective 65K contextual training?
#2 opened about 1 year ago by win10