diff --git a/README.md b/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..3f641b0eafcb9b1490342a939783c169b51b3b42
--- /dev/null
+++ b/README.md
@@ -0,0 +1,485 @@
+---
+license: apache-2.0
+base_model: mistralai/Mistral-Nemo-Base-2407
+tags:
+- generated_from_trainer
+- axolotl
+datasets:
+- cognitivecomputations/Dolphin-2.9
+- teknium/OpenHermes-2.5
+- m-a-p/CodeFeedback-Filtered-Instruction
+- cognitivecomputations/dolphin-coder
+- cognitivecomputations/samantha-data
+- microsoft/orca-math-word-problems-200k
+- Locutusque/function-calling-chatml
+- internlm/Agent-FLAN
+---
+
+# Dolphin 2.9.3 Mistral Nemo 12b 🐬
+
+Curated and trained by Eric Hartford and Cognitive Computations
+
+[![Discord](https://img.shields.io/discord/1156064224225808488?logo=Discord&logoColor=%23ffffff&label=Discord&link=https%3A%2F%2Fdiscord.gg%2FtCMkMDDHwm)](https://discord.gg/h3K4XGj2RH)
+Discord: https://discord.gg/h3K4XGj2RH
+
+
+
+Our appreciation for the sponsors of Dolphin 2.9.3:
+- [Crusoe Cloud](https://crusoe.ai/) - provided an excellent on-demand 8x L40S node
+
+This model is based on mistralai/Mistral-Nemo-Base-2407 and is governed by the Apache 2.0 license.
+
+The base model has a 128K context window; our fine-tuning used an 8192 sequence length.
+
+Dolphin 2.9.3 uses the ChatML prompt template format.
+
+Example:
+
+```
+<|im_start|>system
+You are Dolphin, a helpful AI assistant.<|im_end|>
+<|im_start|>user
+{prompt}<|im_end|>
+<|im_start|>assistant
+
+```
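The template above can also be built programmatically. Below is a minimal sketch of a ChatML formatter (a hand-rolled illustration; in practice, `tokenizer.apply_chat_template` with this model's bundled chat template should produce the same layout):

```python
def to_chatml(messages, add_generation_prompt=True):
    """Render a list of {role, content} dicts in ChatML format."""
    text = "".join(
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in messages
    )
    if add_generation_prompt:
        # Leave the assistant turn open so the model completes it.
        text += "<|im_start|>assistant\n"
    return text

prompt = to_chatml([
    {"role": "system", "content": "You are Dolphin, a helpful AI assistant."},
    {"role": "user", "content": "{prompt}"},
])
```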
+
+Dolphin-2.9.3 has a variety of instruction following, conversational, and coding skills. It also has initial agentic abilities and supports function calling.
+
+Dolphin is uncensored. We have filtered the dataset to remove alignment and bias, which makes the model more compliant. You are advised to implement your own alignment layer before exposing the model as a service: it will be highly compliant with any request, even unethical ones. Please read my blog post about uncensored models: https://erichartford.com/uncensored-models. You are responsible for any content you create using this model. Enjoy responsibly.
+
+Dolphin is licensed under the Apache 2.0 license. We grant permission for any use, including commercial. Dolphin was trained on data generated from GPT-4, among other models.
+
+## Evals
+
+Results on the lm-evaluation-harness `leaderboard` task group:
+
+```
+| Tasks |Version|Filter|n-shot| Metric | |Value | |Stderr|
+|-----------------------------------------------------------|-------|------|-----:|-----------------------|---|-----:|---|------|
+|leaderboard |N/A |none | 0|acc |↑ |0.3437|± |0.0043|
+| | |none | 0|acc_norm |↑ |0.5076|± |0.0053|
+| | |none | 0|exact_match |↑ |0.0536|± |0.0061|
+| | |none | 0|inst_level_loose_acc |↑ |0.4388|± |N/A |
+| | |none | 0|inst_level_strict_acc |↑ |0.3741|± |N/A |
+| | |none | 0|prompt_level_loose_acc |↑ |0.3105|± |0.0199|
+| | |none | 0|prompt_level_strict_acc|↑ |0.2477|± |0.0186|
+| - leaderboard_bbh |N/A |none | 3|acc_norm |↑ |0.5549|± |0.0061|
+| - leaderboard_bbh_boolean_expressions | 0|none | 3|acc_norm |↑ |0.8640|± |0.0217|
+| - leaderboard_bbh_causal_judgement | 0|none | 3|acc_norm |↑ |0.6417|± |0.0352|
+| - leaderboard_bbh_date_understanding | 0|none | 3|acc_norm |↑ |0.6080|± |0.0309|
+| - leaderboard_bbh_disambiguation_qa | 0|none | 3|acc_norm |↑ |0.6480|± |0.0303|
+| - leaderboard_bbh_formal_fallacies | 0|none | 3|acc_norm |↑ |0.5360|± |0.0316|
+| - leaderboard_bbh_geometric_shapes | 0|none | 3|acc_norm |↑ |0.5240|± |0.0316|
+| - leaderboard_bbh_hyperbaton | 0|none | 3|acc_norm |↑ |0.6440|± |0.0303|
+| - leaderboard_bbh_logical_deduction_five_objects | 0|none | 3|acc_norm |↑ |0.4600|± |0.0316|
+| - leaderboard_bbh_logical_deduction_seven_objects | 0|none | 3|acc_norm |↑ |0.4680|± |0.0316|
+| - leaderboard_bbh_logical_deduction_three_objects | 0|none | 3|acc_norm |↑ |0.7000|± |0.0290|
+| - leaderboard_bbh_movie_recommendation | 0|none | 3|acc_norm |↑ |0.8160|± |0.0246|
+| - leaderboard_bbh_navigate | 0|none | 3|acc_norm |↑ |0.6040|± |0.0310|
+| - leaderboard_bbh_object_counting | 0|none | 3|acc_norm |↑ |0.3680|± |0.0306|
+| - leaderboard_bbh_penguins_in_a_table | 0|none | 3|acc_norm |↑ |0.5548|± |0.0413|
+| - leaderboard_bbh_reasoning_about_colored_objects | 0|none | 3|acc_norm |↑ |0.6320|± |0.0306|
+| - leaderboard_bbh_ruin_names | 0|none | 3|acc_norm |↑ |0.7440|± |0.0277|
+| - leaderboard_bbh_salient_translation_error_detection | 0|none | 3|acc_norm |↑ |0.5280|± |0.0316|
+| - leaderboard_bbh_snarks | 0|none | 3|acc_norm |↑ |0.6292|± |0.0363|
+| - leaderboard_bbh_sports_understanding | 0|none | 3|acc_norm |↑ |0.8040|± |0.0252|
+| - leaderboard_bbh_temporal_sequences | 0|none | 3|acc_norm |↑ |0.4680|± |0.0316|
+| - leaderboard_bbh_tracking_shuffled_objects_five_objects | 0|none | 3|acc_norm |↑ |0.2160|± |0.0261|
+| - leaderboard_bbh_tracking_shuffled_objects_seven_objects| 0|none | 3|acc_norm |↑ |0.1160|± |0.0203|
+| - leaderboard_bbh_tracking_shuffled_objects_three_objects| 0|none | 3|acc_norm |↑ |0.3000|± |0.0290|
+| - leaderboard_bbh_web_of_lies | 0|none | 3|acc_norm |↑ |0.4880|± |0.0317|
+| - leaderboard_gpqa |N/A |none | 0|acc_norm |↑ |0.3146|± |0.0135|
+| - leaderboard_gpqa_diamond | 1|none | 0|acc_norm |↑ |0.3182|± |0.0332|
+| - leaderboard_gpqa_extended | 1|none | 0|acc_norm |↑ |0.3187|± |0.0200|
+| - leaderboard_gpqa_main | 1|none | 0|acc_norm |↑ |0.3080|± |0.0218|
+| - leaderboard_ifeval | 2|none | 0|inst_level_loose_acc |↑ |0.4388|± |N/A |
+| | |none | 0|inst_level_strict_acc |↑ |0.3741|± |N/A |
+| | |none | 0|prompt_level_loose_acc |↑ |0.3105|± |0.0199|
+| | |none | 0|prompt_level_strict_acc|↑ |0.2477|± |0.0186|
+| - leaderboard_math_algebra_hard | 1|none | 4|exact_match |↑ |0.0749|± |0.0150|
+| - leaderboard_math_counting_and_prob_hard | 1|none | 4|exact_match |↑ |0.0244|± |0.0140|
+| - leaderboard_math_geometry_hard | 1|none | 4|exact_match |↑ |0.0227|± |0.0130|
+| - leaderboard_math_hard |N/A |none | 4|exact_match |↑ |0.0536|± |0.0061|
+| - leaderboard_math_intermediate_algebra_hard | 1|none | 4|exact_match |↑ |0.0250|± |0.0093|
+| - leaderboard_math_num_theory_hard | 1|none | 4|exact_match |↑ |0.0390|± |0.0156|
+| - leaderboard_math_prealgebra_hard | 1|none | 4|exact_match |↑ |0.1295|± |0.0242|
+| - leaderboard_math_precalculus_hard | 1|none | 4|exact_match |↑ |0.0296|± |0.0146|
+| - leaderboard_mmlu_pro | 0.1|none | 5|acc |↑ |0.3437|± |0.0043|
+| - leaderboard_musr |N/A |none | 0|acc_norm |↑ |0.4511|± |0.0178|
+| - leaderboard_musr_murder_mysteries | 1|none | 0|acc_norm |↑ |0.5880|± |0.0312|
+| - leaderboard_musr_object_placements | 1|none | 0|acc_norm |↑ |0.3438|± |0.0297|
+| - leaderboard_musr_team_allocation | 1|none | 0|acc_norm |↑ |0.4240|± |0.0313|
+
+| Groups |Version|Filter|n-shot| Metric | |Value | |Stderr|
+|------------------------|-------|------|-----:|-----------------------|---|-----:|---|------|
+|leaderboard |N/A |none | 0|acc |↑ |0.3437|± |0.0043|
+| | |none | 0|acc_norm |↑ |0.5076|± |0.0053|
+| | |none | 0|exact_match |↑ |0.0536|± |0.0061|
+| | |none | 0|inst_level_loose_acc |↑ |0.4388|± |N/A |
+| | |none | 0|inst_level_strict_acc |↑ |0.3741|± |N/A |
+| | |none | 0|prompt_level_loose_acc |↑ |0.3105|± |0.0199|
+| | |none | 0|prompt_level_strict_acc|↑ |0.2477|± |0.0186|
+| - leaderboard_bbh |N/A |none | 3|acc_norm |↑ |0.5549|± |0.0061|
+| - leaderboard_gpqa |N/A |none | 0|acc_norm |↑ |0.3146|± |0.0135|
+| - leaderboard_math_hard|N/A |none | 4|exact_match |↑ |0.0536|± |0.0061|
+| - leaderboard_musr |N/A |none | 0|acc_norm |↑ |0.4511|± |0.0178|
+```
+
+
+
+## Training
+
+
+
+[Built with Axolotl](https://github.com/axolotl-ai-cloud/axolotl)
+
+Axolotl config:
+
+axolotl version: `0.4.1`
+```yaml
+base_model: /workspace/models/Mistral-Nemo-Base-2407
+model_type: AutoModelForCausalLM
+tokenizer_type: AutoTokenizer
+
+load_in_8bit: false
+# load_in_4bit: true
+strict: false
+
+datasets:
+ - path: /workspace/datasets/dolphin-2.9.3/dolphin201-sharegpt2.jsonl
+ type: sharegpt
+ conversation: chatml
+ - path: /workspace/datasets/dolphin-2.9.3/SystemChat_filtered_sharegpt.jsonl
+ type: sharegpt
+ conversation: chatml
+ - path: /workspace/datasets/dolphin-2.9.3/SystemChat_multilingual_sharegpt.jsonl
+ type: sharegpt
+ conversation: chatml
+ - path: /workspace/datasets/dolphin-2.9.3/dolphin-coder-translate-sharegpt2.jsonl
+ type: sharegpt
+ conversation: chatml
+ - path: /workspace/datasets/dolphin-2.9.3/dolphin-coder-codegen-sharegpt2.jsonl
+ type: sharegpt
+ conversation: chatml
+ - path: /workspace/datasets/dolphin-2.9.3/m-a-p_Code-Feedback-sharegpt-unfiltered.jsonl
+ type: sharegpt
+ conversation: chatml
+ - path: /workspace/datasets/dolphin-2.9.3/m-a-p_CodeFeedback-Filtered-Instruction-sharegpt-unfiltered.jsonl
+ type: sharegpt
+ conversation: chatml
+ - path: /workspace/datasets/dolphin-2.9.3/not_samantha_norefusals.jsonl
+ type: sharegpt
+ conversation: chatml
+ - path: /workspace/datasets/dolphin-2.9.3/Orca-Math-resort-unfiltered.jsonl
+ type: sharegpt
+ conversation: chatml
+ - path: /workspace/datasets/dolphin-2.9.3/agent_instruct_react_unfiltered.jsonl
+ type: sharegpt
+ conversation: chatml
+ - path: /workspace/datasets/dolphin-2.9.3/toolbench_instruct_j1s1_3k_unfiltered.jsonl
+ type: sharegpt
+ conversation: chatml
+ - path: /workspace/datasets/dolphin-2.9.3/toolbench_negative_unfiltered.jsonl
+ type: sharegpt
+ conversation: chatml
+ - path: /workspace/datasets/dolphin-2.9.3/toolbench_react_10p_unfiltered.jsonl
+ type: sharegpt
+ conversation: chatml
+ - path: /workspace/datasets/dolphin-2.9.3/toolbench_tflan_cot_30p_unfiltered.jsonl
+ type: sharegpt
+ conversation: chatml
+ - path: /workspace/datasets/dolphin-2.9.3/openhermes200k_unfiltered.jsonl
+ type: sharegpt
+ conversation: chatml
+
+chat_template: chatml
+# adapter: qlora
+# lora_r: 128
+# lora_alpha: 16
+# lora_modules_to_save: [embed_tokens, lm_head]
+# lora_dropout: 0.05
+# lora_target_linear: true
+
+
+unfrozen_parameters:
+- ^lm_head.weight$
+- ^model.embed_tokens.weight$
+- input_layernorm
+- model.norm
+- post_attention_layernorm
+- self_attn.rotary_emb
+# mlp.down_proj layers
+- model.layers.0.mlp.down_proj
+- model.layers.1.mlp.down_proj
+- model.layers.4.mlp.down_proj
+- model.layers.37.mlp.down_proj
+- model.layers.24.mlp.down_proj
+- model.layers.2.mlp.down_proj
+- model.layers.38.mlp.down_proj
+- model.layers.35.mlp.down_proj
+- model.layers.25.mlp.down_proj
+- model.layers.6.mlp.down_proj
+- model.layers.22.mlp.down_proj
+- model.layers.23.mlp.down_proj
+- model.layers.3.mlp.down_proj
+- model.layers.21.mlp.down_proj
+- model.layers.5.mlp.down_proj
+- model.layers.28.mlp.down_proj
+- model.layers.20.mlp.down_proj
+- model.layers.26.mlp.down_proj
+- model.layers.19.mlp.down_proj
+- model.layers.34.mlp.down_proj
+# mlp.gate_proj layers
+- model.layers.2.mlp.gate_proj
+- model.layers.1.mlp.gate_proj
+- model.layers.3.mlp.gate_proj
+- model.layers.5.mlp.gate_proj
+- model.layers.4.mlp.gate_proj
+- model.layers.35.mlp.gate_proj
+- model.layers.36.mlp.gate_proj
+- model.layers.37.mlp.gate_proj
+- model.layers.38.mlp.gate_proj
+- model.layers.34.mlp.gate_proj
+- model.layers.33.mlp.gate_proj
+- model.layers.8.mlp.gate_proj
+- model.layers.32.mlp.gate_proj
+- model.layers.6.mlp.gate_proj
+- model.layers.28.mlp.gate_proj
+- model.layers.26.mlp.gate_proj
+- model.layers.30.mlp.gate_proj
+- model.layers.23.mlp.gate_proj
+- model.layers.29.mlp.gate_proj
+- model.layers.27.mlp.gate_proj
+# mlp.up_proj layers
+- model.layers.3.mlp.up_proj
+- model.layers.4.mlp.up_proj
+- model.layers.6.mlp.up_proj
+- model.layers.2.mlp.up_proj
+- model.layers.5.mlp.up_proj
+- model.layers.8.mlp.up_proj
+- model.layers.10.mlp.up_proj
+- model.layers.9.mlp.up_proj
+- model.layers.7.mlp.up_proj
+- model.layers.0.mlp.up_proj
+- model.layers.17.mlp.up_proj
+- model.layers.15.mlp.up_proj
+- model.layers.22.mlp.up_proj
+- model.layers.18.mlp.up_proj
+- model.layers.16.mlp.up_proj
+- model.layers.11.mlp.up_proj
+- model.layers.21.mlp.up_proj
+- model.layers.23.mlp.up_proj
+- model.layers.20.mlp.up_proj
+- model.layers.27.mlp.up_proj
+# self_attn.k_proj layers
+- model.layers.30.self_attn.k_proj
+- model.layers.27.self_attn.k_proj
+- model.layers.25.self_attn.k_proj
+- model.layers.33.self_attn.k_proj
+- model.layers.26.self_attn.k_proj
+- model.layers.31.self_attn.k_proj
+- model.layers.35.self_attn.k_proj
+- model.layers.39.self_attn.k_proj
+- model.layers.22.self_attn.k_proj
+- model.layers.24.self_attn.k_proj
+- model.layers.21.self_attn.k_proj
+- model.layers.28.self_attn.k_proj
+- model.layers.23.self_attn.k_proj
+- model.layers.36.self_attn.k_proj
+- model.layers.20.self_attn.k_proj
+- model.layers.37.self_attn.k_proj
+- model.layers.29.self_attn.k_proj
+- model.layers.32.self_attn.k_proj
+- model.layers.16.self_attn.k_proj
+- model.layers.18.self_attn.k_proj
+# self_attn.o_proj layers
+- model.layers.7.self_attn.o_proj
+- model.layers.6.self_attn.o_proj
+- model.layers.9.self_attn.o_proj
+- model.layers.5.self_attn.o_proj
+- model.layers.27.self_attn.o_proj
+- model.layers.26.self_attn.o_proj
+- model.layers.4.self_attn.o_proj
+- model.layers.31.self_attn.o_proj
+- model.layers.8.self_attn.o_proj
+- model.layers.16.self_attn.o_proj
+- model.layers.3.self_attn.o_proj
+- model.layers.10.self_attn.o_proj
+- model.layers.18.self_attn.o_proj
+- model.layers.33.self_attn.o_proj
+- model.layers.17.self_attn.o_proj
+- model.layers.32.self_attn.o_proj
+- model.layers.30.self_attn.o_proj
+- model.layers.2.self_attn.o_proj
+- model.layers.15.self_attn.o_proj
+- model.layers.11.self_attn.o_proj
+# self_attn.q_proj layers
+- model.layers.14.self_attn.q_proj
+- model.layers.11.self_attn.q_proj
+- model.layers.15.self_attn.q_proj
+- model.layers.9.self_attn.q_proj
+- model.layers.8.self_attn.q_proj
+- model.layers.18.self_attn.q_proj
+- model.layers.12.self_attn.q_proj
+- model.layers.13.self_attn.q_proj
+- model.layers.19.self_attn.q_proj
+- model.layers.16.self_attn.q_proj
+- model.layers.10.self_attn.q_proj
+- model.layers.17.self_attn.q_proj
+- model.layers.7.self_attn.q_proj
+- model.layers.5.self_attn.q_proj
+- model.layers.20.self_attn.q_proj
+- model.layers.3.self_attn.q_proj
+- model.layers.26.self_attn.q_proj
+- model.layers.27.self_attn.q_proj
+- model.layers.28.self_attn.q_proj
+- model.layers.33.self_attn.q_proj
+# self_attn.v_proj layers
+- model.layers.27.self_attn.v_proj
+- model.layers.20.self_attn.v_proj
+- model.layers.24.self_attn.v_proj
+- model.layers.25.self_attn.v_proj
+- model.layers.30.self_attn.v_proj
+- model.layers.2.self_attn.v_proj
+- model.layers.23.self_attn.v_proj
+- model.layers.22.self_attn.v_proj
+- model.layers.26.self_attn.v_proj
+- model.layers.33.self_attn.v_proj
+- model.layers.37.self_attn.v_proj
+- model.layers.7.self_attn.v_proj
+- model.layers.4.self_attn.v_proj
+- model.layers.18.self_attn.v_proj
+- model.layers.31.self_attn.v_proj
+- model.layers.17.self_attn.v_proj
+- model.layers.35.self_attn.v_proj
+- model.layers.32.self_attn.v_proj
+- model.layers.21.self_attn.v_proj
+- model.layers.3.self_attn.v_proj
+
+
+
+dataset_prepared_path: /workspace/axolotl/dolph-2.9.3-nemo-prepared
+val_set_size: 0.01
+output_dir: /workspace/axolotl/dolphin-2.9.3-mistral-nemo
+
+sequence_len: 8192
+sample_packing: true
+pad_to_sequence_len: true
+
+wandb_project: dolphin-2.9.3-Mistral-nemo
+wandb_watch:
+wandb_run_id:
+wandb_log_model:
+
+gradient_accumulation_steps: 16
+micro_batch_size: 1
+num_epochs: 3
+optimizer: adamw_torch
+lr_scheduler: cosine
+learning_rate: 5e-6
+train_on_inputs: false
+group_by_length: false
+bf16: auto
+fp16:
+tf32:
+
+gradient_checkpointing: true
+gradient_checkpointing_kwargs:
+ use_reentrant: false
+early_stopping_patience:
+resume_from_checkpoint:
+logging_steps: 1
+xformers_attention:
+flash_attention: true
+
+warmup_steps: 100
+# evals_per_epoch: 4
+eval_table_size:
+saves_per_epoch: 1
+save_total_limit: 2
+save_steps:
+debug:
+deepspeed: deepspeed_configs/zero3_bf16.json
+weight_decay: 0.1
+special_tokens:
+ eos_token: "<|im_end|>"
+ pad_token: ""
+ bos_token: ""
+ unk_token: ""
+tokens:
+ - "<|im_start|>"
+
+
+# fsdp:
+# - full_shard
+# - auto_wrap
+# fsdp_config:
+# fsdp_limit_all_gathers: true
+# fsdp_sync_module_states: true
+# fsdp_offload_params: true
+# fsdp_use_orig_params: false
+# fsdp_cpu_ram_efficient_loading: true
+# fsdp_transformer_layer_cls_to_wrap: MixtralSparseMoeBlock
+# fsdp_state_dict_type: FULL_STATE_DICT
+# fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP
+# fsdp_sharding_strategy: FULL_SHARD
+# fsdp_forward_prefetch: false
+# fsdp_backward_prefetch: BACKWARD_PRE
+```
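The `unfrozen_parameters` entries in the config above are patterns matched against model parameter names: anything that matches at least one pattern is trained, everything else stays frozen. A minimal sketch of that selection logic (an assumed illustration of the behavior, not axolotl's exact implementation):

```python
import re

# A small subset of the unfrozen_parameters list from the config above.
unfrozen_patterns = [
    r"^lm_head.weight$",
    r"^model.embed_tokens.weight$",
    r"input_layernorm",
    r"model.layers.0.mlp.down_proj",
]

def requires_grad(param_name: str) -> bool:
    # A parameter is trained iff any pattern matches somewhere in its name.
    return any(re.search(p, param_name) for p in unfrozen_patterns)
```

Anchored patterns like `^lm_head.weight$` match one exact tensor, while bare substrings like `input_layernorm` unfreeze that component in every layer.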
+
+
+
+[Visualize in Weights & Biases](https://wandb.ai/ehartford/dolphin-2.9.3-Mistral-nemo/runs/c23odyoj)
+# workspace/axolotl/dolphin-2.9.3-mistral-nemo
+
+This model is a fine-tune of mistralai/Mistral-Nemo-Base-2407 on the datasets listed above.
+It achieves the following results on the evaluation set:
+- Loss: 0.5605
+
+## Model description
+
+Dolphin 2.9.3 is an instruction-tuned chat model with conversational, coding, and initial agentic and function-calling abilities, as described above.
+
+## Intended uses & limitations
+
+General-purpose assistant, coding, and function-calling use. The model is uncensored; implement your own alignment layer before exposing it as a service (see the note above).
+
+## Training and evaluation data
+
+The datasets listed at the top of this card, converted to ChatML conversations; 1% of the data was held out for evaluation (`val_set_size: 0.01`).
+
+## Training procedure
+
+### Training hyperparameters
+
+The following hyperparameters were used during training:
+- learning_rate: 5e-06
+- train_batch_size: 1
+- eval_batch_size: 1
+- seed: 42
+- distributed_type: multi-GPU
+- num_devices: 8
+- gradient_accumulation_steps: 16
+- total_train_batch_size: 128
+- total_eval_batch_size: 8
+- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+- lr_scheduler_type: cosine
+- lr_scheduler_warmup_steps: 100
+- num_epochs: 3
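The total batch sizes above follow directly from the per-device settings and the 8-GPU setup:

```python
# Values from the axolotl config and hyperparameter list above.
micro_batch_size = 1
gradient_accumulation_steps = 16
num_devices = 8

# Effective (total) train batch size per optimizer step.
total_train_batch_size = micro_batch_size * gradient_accumulation_steps * num_devices

# Eval does no gradient accumulation, so it is just per-device size x devices.
total_eval_batch_size = micro_batch_size * num_devices
```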
+
+### Training results
+
+| Training Loss | Epoch | Step | Validation Loss |
+|:-------------:|:------:|:----:|:---------------:|
+| 0.5691 | 1.0162 | 983 | 0.5734 |
+| 0.5335 | 2.0174 | 1968 | 0.5609 |
+| 0.5297 | 2.9639 | 2901 | 0.5605 |
+
+
+### Framework versions
+
+- Transformers 4.43.0.dev0
+- Pytorch 2.2.2+cu121
+- Datasets 2.19.1
+- Tokenizers 0.19.1
diff --git a/added_tokens.json b/added_tokens.json
new file mode 100644
index 0000000000000000000000000000000000000000..92a2d8613f37d1b178ded5ea1464a2cb91bb052e
--- /dev/null
+++ b/added_tokens.json
@@ -0,0 +1,4 @@
+{
+ "<|im_end|>": 131072,
+ "<|im_start|>": 131073
+}
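The two ChatML tokens are appended directly after the base tokenizer's 131,072-entry vocabulary, which is why `config.json` below reports `vocab_size: 131074`. The relationship, spelled out:

```python
# Base Mistral-Nemo-Base-2407 tokenizer vocabulary size.
base_vocab_size = 131072

# Contents of added_tokens.json above: new ids continue where the base vocab ends.
added_tokens = {"<|im_end|>": 131072, "<|im_start|>": 131073}

# Resulting embedding/vocab size after adding the ChatML tokens.
vocab_size = base_vocab_size + len(added_tokens)
```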
diff --git a/config.json b/config.json
new file mode 100644
index 0000000000000000000000000000000000000000..92651998896da02e41337f3a4ab3e87b2b90d345
--- /dev/null
+++ b/config.json
@@ -0,0 +1,38 @@
+{
+ "_name_or_path": "mistralai/Mistral-Nemo-Base-2407",
+ "architectures": [
+ "MistralForCausalLM"
+ ],
+ "attention_dropout": 0.0,
+ "bos_token_id": 1,
+ "eos_token_id": 131072,
+ "head_dim": 128,
+ "hidden_act": "silu",
+ "hidden_size": 5120,
+ "initializer_range": 0.02,
+ "intermediate_size": 14336,
+ "max_position_embeddings": 1024000,
+ "model_type": "mistral",
+ "num_attention_heads": 32,
+ "num_hidden_layers": 40,
+ "num_key_value_heads": 8,
+ "rms_norm_eps": 1e-05,
+ "rope_theta": 1000000.0,
+ "sliding_window": null,
+ "tie_word_embeddings": false,
+ "torch_dtype": "bfloat16",
+ "transformers_version": "4.43.0.dev0",
+ "use_cache": false,
+ "vocab_size": 131074,
+ "quantization_config": {
+ "quant_method": "exl2",
+ "version": "0.2.2",
+ "bits": 4.5,
+ "head_bits": 8,
+ "calibration": {
+ "rows": 115,
+ "length": 2048,
+ "dataset": "(default)"
+ }
+ }
+}
\ No newline at end of file
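The attention geometry implied by the config above is grouped-query attention: 32 query heads share 8 key/value heads, and because `head_dim` is set explicitly, the q/k/v projections map out of `hidden_size` into a smaller head space (4096 < 5120):

```python
# Values from config.json above.
hidden_size = 5120
num_attention_heads = 32
num_key_value_heads = 8
head_dim = 128

# Output dimensions of the attention projections.
q_out = num_attention_heads * head_dim       # query projection width
kv_out = num_key_value_heads * head_dim      # key/value projection width

# How many query heads read from each shared KV head.
queries_per_kv_head = num_attention_heads // num_key_value_heads
```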
diff --git a/generation_config.json b/generation_config.json
new file mode 100644
index 0000000000000000000000000000000000000000..508dba5db2b437a5b2e6c84c20aed6c4677169a3
--- /dev/null
+++ b/generation_config.json
@@ -0,0 +1,7 @@
+{
+ "_from_model_config": true,
+ "bos_token_id": 1,
+ "do_sample": true,
+ "eos_token_id": 2,
+ "transformers_version": "4.43.0.dev0"
+}
diff --git a/model.safetensors.index.json b/model.safetensors.index.json
new file mode 100644
index 0000000000000000000000000000000000000000..fdfdc1caa5439bdcfe81e6586261cf76e37d8f8a
--- /dev/null
+++ b/model.safetensors.index.json
@@ -0,0 +1,370 @@
+{
+ "metadata": {
+ "total_size": 24495605760
+ },
+ "weight_map": {
+ "lm_head.weight": "model-00005-of-00005.safetensors",
+ "model.embed_tokens.weight": "model-00001-of-00005.safetensors",
+ "model.layers.0.input_layernorm.weight": "model-00001-of-00005.safetensors",
+ "model.layers.0.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.0.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
+ "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.1.input_layernorm.weight": "model-00001-of-00005.safetensors",
+ "model.layers.1.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.1.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
+ "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.10.input_layernorm.weight": "model-00002-of-00005.safetensors",
+ "model.layers.10.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.10.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.10.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.10.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
+ "model.layers.10.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.10.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.10.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.10.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.11.input_layernorm.weight": "model-00002-of-00005.safetensors",
+ "model.layers.11.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.11.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.11.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.11.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
+ "model.layers.11.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.11.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.11.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.11.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.12.input_layernorm.weight": "model-00002-of-00005.safetensors",
+ "model.layers.12.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.12.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.12.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.12.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
+ "model.layers.12.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.12.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.12.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.12.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.13.input_layernorm.weight": "model-00002-of-00005.safetensors",
+ "model.layers.13.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.13.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.13.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.13.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
+ "model.layers.13.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.13.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.13.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.13.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.14.input_layernorm.weight": "model-00002-of-00005.safetensors",
+ "model.layers.14.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.14.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.14.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.14.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
+ "model.layers.14.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.14.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.14.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.14.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.15.input_layernorm.weight": "model-00003-of-00005.safetensors",
+ "model.layers.15.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.15.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.15.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.15.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
+ "model.layers.15.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.15.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.15.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.15.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.16.input_layernorm.weight": "model-00003-of-00005.safetensors",
+ "model.layers.16.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.16.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.16.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.16.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
+ "model.layers.16.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.16.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.16.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.16.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.17.input_layernorm.weight": "model-00003-of-00005.safetensors",
+ "model.layers.17.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.17.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.17.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.17.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
+ "model.layers.17.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.17.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.17.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.17.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.18.input_layernorm.weight": "model-00003-of-00005.safetensors",
+ "model.layers.18.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.18.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.18.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.18.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
+ "model.layers.18.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.18.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.18.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.18.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.19.input_layernorm.weight": "model-00003-of-00005.safetensors",
+ "model.layers.19.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.19.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.19.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.19.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
+ "model.layers.19.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.19.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.19.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.19.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.2.input_layernorm.weight": "model-00001-of-00005.safetensors",
+ "model.layers.2.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.2.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
+ "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.20.input_layernorm.weight": "model-00003-of-00005.safetensors",
+ "model.layers.20.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.20.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.20.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.20.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
+ "model.layers.20.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.20.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.20.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.20.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.21.input_layernorm.weight": "model-00003-of-00005.safetensors",
+ "model.layers.21.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.21.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.21.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.21.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
+ "model.layers.21.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.21.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.21.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.21.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.22.input_layernorm.weight": "model-00003-of-00005.safetensors",
+ "model.layers.22.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.22.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.22.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.22.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
+ "model.layers.22.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.22.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.22.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.22.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.23.input_layernorm.weight": "model-00003-of-00005.safetensors",
+ "model.layers.23.mlp.down_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.23.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.23.mlp.up_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.23.post_attention_layernorm.weight": "model-00003-of-00005.safetensors",
+ "model.layers.23.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.23.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.23.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.23.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.24.input_layernorm.weight": "model-00004-of-00005.safetensors",
+ "model.layers.24.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.24.mlp.gate_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.24.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.24.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
+ "model.layers.24.self_attn.k_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.24.self_attn.o_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.24.self_attn.q_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.24.self_attn.v_proj.weight": "model-00003-of-00005.safetensors",
+ "model.layers.25.input_layernorm.weight": "model-00004-of-00005.safetensors",
+ "model.layers.25.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.25.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.25.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.25.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
+ "model.layers.25.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.25.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.25.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.25.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.26.input_layernorm.weight": "model-00004-of-00005.safetensors",
+ "model.layers.26.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.26.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.26.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.26.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
+ "model.layers.26.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.26.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.26.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.26.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.27.input_layernorm.weight": "model-00004-of-00005.safetensors",
+ "model.layers.27.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.27.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.27.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.27.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
+ "model.layers.27.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.27.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.27.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.27.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.28.input_layernorm.weight": "model-00004-of-00005.safetensors",
+ "model.layers.28.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.28.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.28.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.28.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
+ "model.layers.28.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.28.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.28.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.28.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.29.input_layernorm.weight": "model-00004-of-00005.safetensors",
+ "model.layers.29.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.29.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.29.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.29.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
+ "model.layers.29.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.29.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.29.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.29.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.3.input_layernorm.weight": "model-00001-of-00005.safetensors",
+ "model.layers.3.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.3.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.3.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.3.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
+ "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.3.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.30.input_layernorm.weight": "model-00004-of-00005.safetensors",
+ "model.layers.30.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.30.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.30.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.30.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
+ "model.layers.30.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.30.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.30.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.30.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.31.input_layernorm.weight": "model-00004-of-00005.safetensors",
+ "model.layers.31.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.31.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.31.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.31.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
+ "model.layers.31.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.31.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.31.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.31.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.32.input_layernorm.weight": "model-00004-of-00005.safetensors",
+ "model.layers.32.mlp.down_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.32.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.32.mlp.up_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.32.post_attention_layernorm.weight": "model-00004-of-00005.safetensors",
+ "model.layers.32.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.32.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.32.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.32.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.33.input_layernorm.weight": "model-00005-of-00005.safetensors",
+ "model.layers.33.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.33.mlp.gate_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.33.mlp.up_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.33.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
+ "model.layers.33.self_attn.k_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.33.self_attn.o_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.33.self_attn.q_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.33.self_attn.v_proj.weight": "model-00004-of-00005.safetensors",
+ "model.layers.34.input_layernorm.weight": "model-00005-of-00005.safetensors",
+ "model.layers.34.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.34.mlp.gate_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.34.mlp.up_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.34.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
+ "model.layers.34.self_attn.k_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.34.self_attn.o_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.34.self_attn.q_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.34.self_attn.v_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.35.input_layernorm.weight": "model-00005-of-00005.safetensors",
+ "model.layers.35.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.35.mlp.gate_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.35.mlp.up_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.35.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
+ "model.layers.35.self_attn.k_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.35.self_attn.o_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.35.self_attn.q_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.35.self_attn.v_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.36.input_layernorm.weight": "model-00005-of-00005.safetensors",
+ "model.layers.36.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.36.mlp.gate_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.36.mlp.up_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.36.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
+ "model.layers.36.self_attn.k_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.36.self_attn.o_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.36.self_attn.q_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.36.self_attn.v_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.37.input_layernorm.weight": "model-00005-of-00005.safetensors",
+ "model.layers.37.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.37.mlp.gate_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.37.mlp.up_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.37.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
+ "model.layers.37.self_attn.k_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.37.self_attn.o_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.37.self_attn.q_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.37.self_attn.v_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.38.input_layernorm.weight": "model-00005-of-00005.safetensors",
+ "model.layers.38.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.38.mlp.gate_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.38.mlp.up_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.38.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
+ "model.layers.38.self_attn.k_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.38.self_attn.o_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.38.self_attn.q_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.38.self_attn.v_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.39.input_layernorm.weight": "model-00005-of-00005.safetensors",
+ "model.layers.39.mlp.down_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.39.mlp.gate_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.39.mlp.up_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.39.post_attention_layernorm.weight": "model-00005-of-00005.safetensors",
+ "model.layers.39.self_attn.k_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.39.self_attn.o_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.39.self_attn.q_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.39.self_attn.v_proj.weight": "model-00005-of-00005.safetensors",
+ "model.layers.4.input_layernorm.weight": "model-00001-of-00005.safetensors",
+ "model.layers.4.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.4.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.4.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.4.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
+ "model.layers.4.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.4.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.4.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.4.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.5.input_layernorm.weight": "model-00001-of-00005.safetensors",
+ "model.layers.5.mlp.down_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.5.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.5.mlp.up_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.5.post_attention_layernorm.weight": "model-00001-of-00005.safetensors",
+ "model.layers.5.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.5.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.5.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.5.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.6.input_layernorm.weight": "model-00002-of-00005.safetensors",
+ "model.layers.6.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.6.mlp.gate_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.6.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.6.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
+ "model.layers.6.self_attn.k_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.6.self_attn.o_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.6.self_attn.q_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.6.self_attn.v_proj.weight": "model-00001-of-00005.safetensors",
+ "model.layers.7.input_layernorm.weight": "model-00002-of-00005.safetensors",
+ "model.layers.7.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.7.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.7.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.7.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
+ "model.layers.7.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.7.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.7.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.7.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.8.input_layernorm.weight": "model-00002-of-00005.safetensors",
+ "model.layers.8.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.8.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.8.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.8.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
+ "model.layers.8.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.8.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.8.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.8.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.9.input_layernorm.weight": "model-00002-of-00005.safetensors",
+ "model.layers.9.mlp.down_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.9.mlp.gate_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.9.mlp.up_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.9.post_attention_layernorm.weight": "model-00002-of-00005.safetensors",
+ "model.layers.9.self_attn.k_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.9.self_attn.o_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.9.self_attn.q_proj.weight": "model-00002-of-00005.safetensors",
+ "model.layers.9.self_attn.v_proj.weight": "model-00002-of-00005.safetensors",
+ "model.norm.weight": "model-00005-of-00005.safetensors"
+ }
+}
diff --git a/output.safetensors b/output.safetensors
new file mode 100644
index 0000000000000000000000000000000000000000..47ac06f6db4e8ad96eef32ab7befc08c3b1b628b
--- /dev/null
+++ b/output.safetensors
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:a3ce31fa14eac00a6e67f2f23f18ce812e0e3961a2bcae4a9928d1843b284de7
+size 8150244828
diff --git a/special_tokens_map.json b/special_tokens_map.json
new file mode 100644
index 0000000000000000000000000000000000000000..547986c0f0567961dd9ff48218defed8e4d5aa59
--- /dev/null
+++ b/special_tokens_map.json
@@ -0,0 +1,30 @@
+{
+ "bos_token": {
+    "content": "<s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "<|im_end|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+    "content": "<pad>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "unk_token": {
+    "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+}
diff --git a/tokenizer.json b/tokenizer.json
new file mode 100644
index 0000000000000000000000000000000000000000..22aa6f372f81cf778a32e12f9b0113a8f2525231
--- /dev/null
+++ b/tokenizer.json
@@ -0,0 +1,409661 @@
+{
+ "version": "1.0",
+ "truncation": null,
+ "padding": null,
+ "added_tokens": [
+ {
+ "id": 0,
+      "content": "<unk>",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 1,
+      "content": "<s>",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 2,
+      "content": "</s>",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 3,
+ "content": "[INST]",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 4,
+ "content": "[/INST]",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 5,
+ "content": "[AVAILABLE_TOOLS]",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 6,
+ "content": "[/AVAILABLE_TOOLS]",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 7,
+ "content": "[TOOL_RESULTS]",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 8,
+ "content": "[/TOOL_RESULTS]",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 9,
+ "content": "[TOOL_CALLS]",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 10,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 11,
+ "content": "[PREFIX]",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 12,
+ "content": "[MIDDLE]",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 13,
+ "content": "[SUFFIX]",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 14,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 15,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 16,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 17,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 18,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 19,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 20,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 21,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 22,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 23,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 24,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 25,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 26,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 27,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 28,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 29,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 30,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 31,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 32,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 33,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 34,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 35,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 36,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 37,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 38,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 39,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 40,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 41,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 42,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 43,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 44,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 45,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 46,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 47,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 48,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 49,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 50,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 51,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 52,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 53,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 54,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 55,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 56,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 57,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 58,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 59,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 60,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 61,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 62,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 63,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 64,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 65,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 66,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 67,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 68,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 69,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 70,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 71,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 72,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 73,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 74,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 75,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 76,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 77,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 78,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 79,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 80,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 81,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 82,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 83,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 84,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 85,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 86,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 87,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 88,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 89,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 90,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 91,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 92,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 93,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 94,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 95,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 96,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 97,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 98,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 99,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 100,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 101,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 102,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 103,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 104,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 105,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 106,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 107,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 108,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 109,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 110,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 111,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 112,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 113,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 114,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 115,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 116,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 117,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 118,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 119,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 120,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 121,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 122,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 123,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 124,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 125,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 126,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 127,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 128,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 129,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 130,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 131,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 132,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 133,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 134,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 135,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 136,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 137,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 138,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 139,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 140,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 141,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 142,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 143,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 144,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 145,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 146,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 147,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 148,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 149,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 150,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 151,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 152,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 153,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 154,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 155,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 156,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 157,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 158,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 159,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 160,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 161,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 162,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 163,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 164,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 165,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 166,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 167,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 168,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 169,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 170,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 171,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 172,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 173,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 174,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 175,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 176,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 177,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 178,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 179,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 180,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 181,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 182,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 183,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 184,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 185,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 186,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 187,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 188,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 189,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 190,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 191,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 192,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 193,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 194,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 195,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 196,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 197,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 198,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 199,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 200,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 201,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 202,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 203,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 204,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 205,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 206,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 207,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 208,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 209,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 210,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 211,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 212,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 213,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 214,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 215,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 216,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 217,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 218,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 219,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 220,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 221,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 222,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 223,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 224,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 225,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 226,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 227,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 228,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 229,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 230,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 231,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 232,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 233,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 234,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 235,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 236,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 237,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 238,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 239,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 240,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 241,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 242,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 243,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 244,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 245,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 246,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 247,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 248,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 249,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 250,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 251,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 252,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 253,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 254,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 255,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 256,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 257,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 258,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 259,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 260,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 261,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 262,
+ "content": "",
+ "single_word": false,
+ "lstrip": false,
+ "rstrip": false,
+ "normalized": false,
+ "special": true
+ },
+ {
+ "id": 263,
+ "content": "