| Name | Quant method | Bits | Size | Use case |
| ---- | ---- | ---- | ---- | ---- |
| CodeQwen1.5-7B-Chat-Q2_K.gguf | Q2_K | 2 | 3.05 GB | smallest, significant quality loss - not recommended for most purposes |
| CodeQwen1.5-7B-Chat-Q3_K_L.gguf | Q3_K_L | 3 | 3.99 GB | small, substantial quality loss |
| CodeQwen1.5-7B-Chat-Q3_K_M.gguf | Q3_K_M | 3 | 3.81 GB | very small, high quality loss |
| CodeQwen1.5-7B-Chat-Q3_K_S.gguf | Q3_K_S | 3 | 3.5 GB | very small, high quality loss |
| CodeQwen1.5-7B-Chat-Q4_0.gguf | Q4_0 | 4 | 4.18 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| CodeQwen1.5-7B-Chat-Q4_K_M.gguf | Q4_K_M | 4 | 4.74 GB | medium, balanced quality - recommended |
| CodeQwen1.5-7B-Chat-Q4_K_S.gguf | Q4_K_S | 4 | 4.41 GB | small, greater quality loss |
| CodeQwen1.5-7B-Chat-Q5_0.gguf | Q5_0 | 5 | 5.04 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| CodeQwen1.5-7B-Chat-Q5_K_M.gguf | Q5_K_M | 5 | 5.43 GB | large, very low quality loss - recommended |
| CodeQwen1.5-7B-Chat-Q5_K_S.gguf | Q5_K_S | 5 | 5.15 GB | large, low quality loss - recommended |
| CodeQwen1.5-7B-Chat-Q6_K.gguf | Q6_K | 6 | 6.38 GB | very large, extremely low quality loss |
| CodeQwen1.5-7B-Chat-Q8_0.gguf | Q8_0 | 8 | 7.71 GB | very large, extremely low quality loss - not recommended |
| CodeQwen1.5-7B-Chat-f16.gguf | f16 | 16 | 14.5 GB | |
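
As a minimal sketch of how one of these files might be used, the snippet below downloads the recommended Q4_K_M quant with `huggingface_hub` and loads it with `llama-cpp-python`. The repository id (`<your-namespace>/CodeQwen1.5-7B-Chat-GGUF`) is a placeholder, and parameters such as `n_ctx` and `n_gpu_layers` are illustrative assumptions, not values taken from this card.

```python
# Minimal sketch: download one quant from the table and run a chat turn with llama-cpp-python.
# The repo_id below is a placeholder (assumption); substitute the repository hosting these files.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="<your-namespace>/CodeQwen1.5-7B-Chat-GGUF",  # placeholder repo id
    filename="CodeQwen1.5-7B-Chat-Q4_K_M.gguf",           # the "recommended" 4.74 GB quant from the table
)

# n_ctx and n_gpu_layers are illustrative; tune them for your hardware.
llm = Llama(model_path=model_path, n_ctx=4096, n_gpu_layers=-1)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful coding assistant."},
        {"role": "user", "content": "Write a Python function that reverses a string."},
    ],
    max_tokens=256,
)
print(response["choices"][0]["message"]["content"])
```

The same pattern applies to any row in the table: smaller quants (Q2_K, Q3_K_*) trade quality for lower RAM/VRAM use, while Q5_K_M, Q6_K, and f16 need proportionally more memory for lower quality loss.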