Shouldn't CodeLlama 34B have 16K context and rope_theta 1M?

#3
by TheBloke - opened
Cognitive Computations org
No description provided.
ehartford changed pull request status to merged
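For reference, the values the question refers to would live in the model's `config.json`. A minimal sketch of the two relevant fields, assuming the standard Llama-architecture config keys used by CodeLlama releases (16K context window and a RoPE base of 1,000,000):

```json
{
  "max_position_embeddings": 16384,
  "rope_theta": 1000000.0
}
```

If the repo's config instead carried the Llama 2 defaults (4096 context, `rope_theta` 10000), updating these two fields would be the expected fix.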
