legraphista/glm-4-9b-chat-GGUF
Text Generation · GGUF
Languages: Chinese, English
Tags: glm, chatglm, thudm, quantized, quantization, static, 16bit, 8bit, 6bit, 5bit, 4bit, 3bit, 2bit
License: glm-4
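The quant tags above correspond to the file sizes in the listing further down. A minimal sketch for shortlisting quants by memory budget; the helper name and the 8.0 GB budget are illustrative assumptions, and the sizes are the file sizes from this repo (actual runtime memory use is somewhat higher than file size):

```python
# Approximate GGUF file sizes for this repo, taken from the file
# listing below (GB). Leave headroom: runtime use exceeds file size.
QUANT_SIZES_GB = {
    "BF16": 18.8, "FP16": 18.8, "Q8_0": 9.99, "Q6_K": 8.26,
    "Q5_K": 7.14, "Q5_K_S": 6.69, "Q4_K": 6.25, "Q4_K_S": 5.75,
    "IQ4_NL": 5.51, "IQ4_XS": 5.3, "Q3_K_L": 5.28, "Q3_K": 5.06,
    "IQ3_M": 4.81, "Q3_K_S": 4.59, "IQ3_S": 4.59, "IQ3_XS": 4.43,
    "Q2_K": 3.99,
}

def quants_under(budget_gb: float) -> list[str]:
    """Return quant names whose file size fits the budget, largest first."""
    fits = [(size, name) for name, size in QUANT_SIZES_GB.items() if size <= budget_gb]
    return [name for _, name in sorted(fits, reverse=True)]

print(quants_under(8.0))  # e.g. ['Q5_K', 'Q5_K_S', 'Q4_K', ...]
```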
glm-4-9b-chat-GGUF (branch: main)
1 contributor · History: 33 commits
Latest commit: legraphista, "Upload README.md with huggingface_hub" (0155a14, verified, 5 months ago)
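The commit messages show every file here was uploaded with huggingface_hub; the same library can fetch a single quant without cloning the whole repo. A minimal sketch, using the Q8_0 filename from the listing below (swap in any other listed filename):

```python
from huggingface_hub import hf_hub_download

# Download one quant from this repo; returns the local cache path.
# Q8_0 is ~9.99 GB, so pick a smaller quant (e.g. IQ3_XS, ~4.43 GB)
# if bandwidth or disk space is limited.
path = hf_hub_download(
    repo_id="legraphista/glm-4-9b-chat-GGUF",
    filename="glm-4-9b-chat.Q8_0.gguf",
)
print(path)
```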
All entries date to 5 months ago. Each .gguf was added by its own "Upload <filename> with huggingface_hub" commit and is stored via Git LFS; .gitattributes was last touched by the IQ3_XS upload, and README.md by its own upload.

File                         Size
.gitattributes               2.56 kB
README.md                    7.52 kB
glm-4-9b-chat.BF16.gguf      18.8 GB
glm-4-9b-chat.FP16.gguf      18.8 GB
glm-4-9b-chat.IQ3_M.gguf     4.81 GB
glm-4-9b-chat.IQ3_S.gguf     4.59 GB
glm-4-9b-chat.IQ3_XS.gguf    4.43 GB
glm-4-9b-chat.IQ4_NL.gguf    5.51 GB
glm-4-9b-chat.IQ4_XS.gguf    5.3 GB
glm-4-9b-chat.Q2_K.gguf      3.99 GB
glm-4-9b-chat.Q3_K.gguf      5.06 GB
glm-4-9b-chat.Q3_K_L.gguf    5.28 GB
glm-4-9b-chat.Q3_K_S.gguf    4.59 GB
glm-4-9b-chat.Q4_K.gguf      6.25 GB
glm-4-9b-chat.Q4_K_S.gguf    5.75 GB
glm-4-9b-chat.Q5_K.gguf      7.14 GB
glm-4-9b-chat.Q5_K_S.gguf    6.69 GB
glm-4-9b-chat.Q6_K.gguf      8.26 GB
glm-4-9b-chat.Q8_0.gguf      9.99 GB
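GGUF files like these are intended for llama.cpp-based runtimes. A minimal sketch of loading a downloaded quant via the llama-cpp-python bindings; the model path, context size, and prompt are illustrative assumptions, and GLM-4 chat-template support depends on the llama.cpp version you build against:

```python
from llama_cpp import Llama  # pip install llama-cpp-python

# Load a locally downloaded quant; n_ctx is an illustrative value,
# not a setting taken from this repo.
llm = Llama(model_path="glm-4-9b-chat.Q8_0.gguf", n_ctx=4096)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello! What can you do?"}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```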