glm-4-9b-chat / modeling_chatglm.py

Commit History

Merge branch 'main' into attention (29038ea), committed by duzx16

Add support for flash attention 2 (a7eaddd), committed by duzx16
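
FlashAttention-2 is selected the same way as the other backends, through the standard transformers `attn_implementation` argument. A minimal sketch of loading the model this way (not taken from the repository, and assuming the `flash-attn` package is installed along with a bf16-capable CUDA GPU):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "THUDM/glm-4-9b-chat"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)

# "flash_attention_2" needs the flash-attn package and fp16/bf16 weights on GPU;
# the "sdpa" and "eager" backends (see the commit further down) need no extra dependencies.
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,
    attn_implementation="flash_attention_2",
    trust_remote_code=True,
).to("cuda").eval()
```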

Add output_attentions for ChatGLMForSequenceClassification (f07eca1), committed by duzx16
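
A hedged sketch of exercising the `output_attentions` flag added here. It assumes the repo's `auto_map` routes `AutoModelForSequenceClassification` to `ChatGLMForSequenceClassification`, and it requests the eager attention path, since fused attention kernels generally do not materialize attention weights; the classification head is freshly initialized unless fine-tuned weights are supplied.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_ID = "THUDM/glm-4-9b-chat"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,
    attn_implementation="eager",  # attention weights are only returned on the eager path
    trust_remote_code=True,
).eval()

inputs = tokenizer("An example sentence to classify.", return_tensors="pt")
with torch.no_grad():
    out = model(**inputs, output_attentions=True)

print(out.logits.shape)           # (batch_size, num_labels)
if out.attentions is not None:
    print(len(out.attentions))    # one tensor of attention weights per layer
```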

Add eager and sdpa attention implementations (835c717), committed by duzx16

Merge branch 'main' of https://huggingface.co/THUDM/glm-4-9b-chat (dba7772), committed by duzx16

Fix default dtype for classification head (0deb1dd), committed by duzx16

Fix classification model (12c8049), committed by duzx16

first commit (4b556ad), committed by Ubuntu