---
base_model:
- rAIfle/Acolyte-22B
quantized_by: Brioch
base_model_relation: quantized
pipeline_tag: text-generation
---
6.5 bpw EXL2 quant of Acolyte-22B.

# Acolyte-22B
A LoRA trained on an assortment of datasets on top of Mistral-Small-Instruct-2409, then SLERP-merged onto the base model at a weight of 0.5. Decent enough for its size. See the LoRA repo for dataset info.
Use the Mistral V2 & V3 template.
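
For reference, the Mistral instruct format (the "Mistral V2 & V3" preset in common frontends) looks roughly like the sketch below. Exact whitespace and BOS/EOS handling varies between tokenizer versions, so treat this as an illustration rather than a canonical spec:

```
<s>[INST] First user message[/INST] Assistant reply</s>[INST] Next user message[/INST]
```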