Why Llama 2 and not Mistral 7B?

Opened by Kernel

Can you also fine-tune Mixtral 8x7B? That would be mind-blowing.

Yellow.ai org

Hello @Kernel, we experimented with Mixtral on one of our downstream tasks, but unfortunately it didn't perform as well as Llama 2, which is why we opted for Llama 2. We appreciate the suggestion, though, and will certainly consider trying Mixtral for the next version.

louisowen6 changed discussion status to closed
