mradermacher/Med-Qwen2-7B-GGUF (1 like)
Tags: Transformers · GGUF · English · conversational · Inference Endpoints
Dataset: Malikeh1375/medical-question-answering-datasets
License: apache-2.0
Branch: main · 1 contributor · 51 commits
Latest commit: auto-patch README.md (1038eb7, verified) by mradermacher, 4 months ago
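The file listing below can also be retrieved programmatically. A minimal sketch, assuming the huggingface_hub Python package is installed; only the repo id comes from this page, everything else is the library's public HfApi interface:

```python
# Enumerate the files in this repo together with their sizes.
# Assumes: pip install huggingface_hub
from huggingface_hub import HfApi

api = HfApi()
repo_id = "mradermacher/Med-Qwen2-7B-GGUF"

# files_metadata=True asks the Hub to include per-file size and LFS details.
info = api.model_info(repo_id, files_metadata=True)
for sibling in info.siblings:
    size_gb = (sibling.size or 0) / 1e9
    print(f"{sibling.rfilename:32s} {size_gb:6.2f} GB")
```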
| File | Size | LFS | Last commit | Updated |
|------|------|-----|-------------|---------|
| .gitattributes | 3.15 kB | | Copy .f16.gguf to Med-Qwen2-7B.f16.gguf | 4 months ago |
| Med-Qwen2-7B.IQ3_M.gguf | 3.57 GB | LFS | Copy .IQ3_M.gguf to Med-Qwen2-7B.IQ3_M.gguf | 4 months ago |
| Med-Qwen2-7B.IQ3_S.gguf | 3.5 GB | LFS | Copy .IQ3_S.gguf to Med-Qwen2-7B.IQ3_S.gguf | 4 months ago |
| Med-Qwen2-7B.IQ3_XS.gguf | 3.35 GB | LFS | Copy .IQ3_XS.gguf to Med-Qwen2-7B.IQ3_XS.gguf | 4 months ago |
| Med-Qwen2-7B.IQ4_XS.gguf | 4.25 GB | LFS | Copy .IQ4_XS.gguf to Med-Qwen2-7B.IQ4_XS.gguf | 4 months ago |
| Med-Qwen2-7B.Q2_K.gguf | 3.02 GB | LFS | Copy .Q2_K.gguf to Med-Qwen2-7B.Q2_K.gguf | 4 months ago |
| Med-Qwen2-7B.Q3_K_L.gguf | 4.09 GB | LFS | Copy .Q3_K_L.gguf to Med-Qwen2-7B.Q3_K_L.gguf | 4 months ago |
| Med-Qwen2-7B.Q3_K_M.gguf | 3.81 GB | LFS | Copy .Q3_K_M.gguf to Med-Qwen2-7B.Q3_K_M.gguf | 4 months ago |
| Med-Qwen2-7B.Q3_K_S.gguf | 3.49 GB | LFS | Copy .Q3_K_S.gguf to Med-Qwen2-7B.Q3_K_S.gguf | 4 months ago |
| Med-Qwen2-7B.Q4_K_M.gguf | 4.68 GB | LFS | Copy .Q4_K_M.gguf to Med-Qwen2-7B.Q4_K_M.gguf | 4 months ago |
| Med-Qwen2-7B.Q4_K_S.gguf | 4.46 GB | LFS | Copy .Q4_K_S.gguf to Med-Qwen2-7B.Q4_K_S.gguf | 4 months ago |
| Med-Qwen2-7B.Q5_K_M.gguf | 5.44 GB | LFS | Copy .Q5_K_M.gguf to Med-Qwen2-7B.Q5_K_M.gguf | 4 months ago |
| Med-Qwen2-7B.Q5_K_S.gguf | 5.32 GB | LFS | Copy .Q5_K_S.gguf to Med-Qwen2-7B.Q5_K_S.gguf | 4 months ago |
| Med-Qwen2-7B.Q6_K.gguf | 6.25 GB | LFS | Copy .Q6_K.gguf to Med-Qwen2-7B.Q6_K.gguf | 4 months ago |
| Med-Qwen2-7B.Q8_0.gguf | 8.1 GB | LFS | Copy .Q8_0.gguf to Med-Qwen2-7B.Q8_0.gguf | 4 months ago |
| Med-Qwen2-7B.f16.gguf | 15.2 GB | LFS | Copy .f16.gguf to Med-Qwen2-7B.f16.gguf | 4 months ago |
| README.md | 3.66 kB | | auto-patch README.md | 4 months ago |
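To use one of these quantizations locally, a single file can be downloaded and loaded with llama.cpp bindings. A minimal sketch, assuming huggingface_hub and llama-cpp-python are installed and that Q4_K_M (4.68 GB) fits your hardware; the prompt, context size, and generation settings are illustrative and not taken from this repo's model card:

```python
# Download one GGUF quant from this repo and run a chat-style query with llama-cpp-python.
# Assumes: pip install huggingface_hub llama-cpp-python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch a single quantization instead of cloning the whole repo (roughly 80 GB of GGUF files).
model_path = hf_hub_download(
    repo_id="mradermacher/Med-Qwen2-7B-GGUF",
    filename="Med-Qwen2-7B.Q4_K_M.gguf",
)

# n_ctx is a guessed context window; adjust it to your memory budget.
llm = Llama(model_path=model_path, n_ctx=4096)

# The repo is tagged "conversational", so a chat-completion call is used here.
response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "What are common symptoms of iron deficiency?"}],
    max_tokens=256,
)
print(response["choices"][0]["message"]["content"])
```

Q4_K_M is commonly treated as a middle ground between file size and output quality; the smaller IQ3 and Q2_K files trade quality for memory, and the f16 file is the full-precision 16-bit conversion.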