| Name | Quant method | Bits | Size | Use case |
| ---- | ---- | ---- | ---- | ---- |
| CodeLlama-70b-Instruct-hf-Q2_K.gguf | Q2_K | 2 | 25.5 GB | smallest, significant quality loss - not recommended for most purposes |
| CodeLlama-70b-Instruct-hf-Q3_K_L.gguf | Q3_K_L | 3 | 36.1 GB | small, substantial quality loss |
| CodeLlama-70b-Instruct-hf-Q3_K_M.gguf | Q3_K_M | 3 | 33.3 GB | very small, high quality loss |
| CodeLlama-70b-Instruct-hf-Q3_K_S.gguf | Q3_K_S | 3 | 29.9 GB | very small, high quality loss |
| CodeLlama-70b-Instruct-hf-Q4_0.gguf | Q4_0 | 4 | 38.9 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| CodeLlama-70b-Instruct-hf-Q4_K_M.gguf | Q4_K_M | 4 | 41.4 GB | medium, balanced quality - recommended |
| CodeLlama-70b-Instruct-hf-Q4_K_S.gguf | Q4_K_S | 4 | 39.2 GB | small, greater quality loss |
| CodeLlama-70b-Instruct-hf-Q5_0.gguf | Q5_0 | 5 | 47.5 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| CodeLlama-70b-Instruct-hf-Q5_K_M.gguf | Q5_K_M | 5 | 48.8 GB | large, very low quality loss - recommended |
| CodeLlama-70b-Instruct-hf-Q5_K_S.gguf | Q5_K_S | 5 | 47.5 GB | large, low quality loss - recommended |
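The sketch below shows one way to load a file from this table with `llama-cpp-python` after downloading it with `huggingface_hub`. It is a minimal example, not part of this model card: the repository ID, local paths, and generation settings are placeholders you would adjust, and the layer offload count depends on your GPU memory (the full file size listed above must fit in RAM plus VRAM).

```python
# Minimal sketch: download one quantized GGUF file and run a prompt with llama-cpp-python.
# The repo_id below is an assumption; point it at the repository that actually hosts these files.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="TheBloke/CodeLlama-70B-Instruct-GGUF",   # placeholder repository ID
    filename="CodeLlama-70b-Instruct-hf-Q4_K_M.gguf", # the "recommended" 41.4 GB quant from the table
)

llm = Llama(
    model_path=model_path,
    n_ctx=4096,        # context window; raise or lower to fit your memory budget
    n_gpu_layers=40,   # number of layers to offload to GPU; 0 for CPU-only
)

output = llm(
    "Write a Python function that reverses a string.",
    max_tokens=256,
)
print(output["choices"][0]["text"])
```

Smaller quants (Q2_K, Q3_K_S) trade quality for memory; if the Q4_K_M file does not fit on your machine, substitute one of the smaller filenames from the table in the `filename` argument.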