---
license: apache-2.0
language:
  - th
library_name: transformers
pipeline_tag: text-generation
tags:
  - pretrained
---

# Model Card for Typhoon-7B

Typhoon-7B is a pretrained Thai-language adaptation of Mistral-7B with 7 billion parameters.

Typhoon-7B outperforms all open-source Thai-language models available as of its release, and its performance is on par with GPT-3.5 while being 2.62 times more efficient.

*Figure: Typhoon benchmark results.*

For full details of this model, please read our paper and release blog post.

## Requirements

transformers 4.34.0 or newer.
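
Below is a minimal loading and generation sketch using the transformers library. The repository ID `scb10x/typhoon-7b` and the example prompt are assumptions for illustration, not taken from this card; adjust them to your setup.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository ID; replace with the actual Hub path of this model if it differs.
model_id = "scb10x/typhoon-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" requires the accelerate package; drop it to load on the default device.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    device_map="auto",
)

# Typhoon-7B is a base model, so prompt it with plain text rather than a chat template.
prompt = "ประเทศไทยมีจังหวัดทั้งหมด"  # "Thailand has a total of ... provinces"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```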

## Model date

Typhoon-7B was trained in December 2023.

## License

Apache-2.0 (commercial use permitted)

## Notice

Typhoon-7B is a pretrained base model; it cannot follow human instructions without few-shot prompting or fine-tuning on an instruction dataset (see the sketch below), and it does not have any moderation mechanisms.
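
To illustrate the few-shot approach, here is a hypothetical prompting sketch. The repository ID, task, and example sentences are assumptions chosen for illustration only; the model simply continues the pattern shown in the prompt.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository ID; see the loading example above.
model_id = "scb10x/typhoon-7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# Hypothetical few-shot prompt: the examples teach the base model a pattern to continue.
few_shot_prompt = (
    "Translate Thai to English.\n"
    "Thai: สวัสดีครับ\nEnglish: Hello.\n"
    "Thai: ขอบคุณมาก\nEnglish: Thank you very much.\n"
    "Thai: วันนี้อากาศดี\nEnglish:"
)

inputs = tokenizer(few_shot_prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```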

## SCB10X AI Team

Kunat Pipatanakul, Phatrasek Jirabovonvisut, Potsawee Manakul, Sittipong Sripaisarnmongkol, Ruangsak Patomwong, Pathomporn Chokchainant, Kasima Tharnpipitchai