| Filename | Quant type | Bits | Size | Description |
| --- | --- | --- | --- | --- |
| Meta-Llama-3-70B-Instruct-Q2_K.gguf | Q2_K | 2 | 26.4 GB | smallest, significant quality loss - not recommended for most purposes |
| Meta-Llama-3-70B-Instruct-Q3_K_L.gguf | Q3_K_L | 3 | 37.1 GB | small, substantial quality loss |
| Meta-Llama-3-70B-Instruct-Q3_K_M.gguf | Q3_K_M | 3 | 34.3 GB | very small, high quality loss |
| Meta-Llama-3-70B-Instruct-Q3_K_S.gguf | Q3_K_S | 3 | 30.9 GB | very small, high quality loss |
| Meta-Llama-3-70B-Instruct-Q4_0.gguf | Q4_0 | 4 | 40 GB | legacy; small, very high quality loss - prefer using Q3_K_M |
| Meta-Llama-3-70B-Instruct-Q4_K_M.gguf | Q4_K_M | 4 | 42.5 GB | medium, balanced quality - recommended |
| Meta-Llama-3-70B-Instruct-Q5_0.gguf | Q5_0 | 5 | 48.7 GB | legacy; medium, balanced quality - prefer using Q4_K_M |
| Meta-Llama-3-70B-Instruct-Q5_K_M.gguf | Q5_K_M | 5 | 50 GB | large, very low quality loss - recommended |
| Meta-Llama-3-70B-Instruct-Q5_K_S.gguf | Q5_K_S | 5 | 48.7 GB | large, low quality loss - recommended |
| Meta-Llama-3-70B-Instruct-Q6_K-00001-of-00002.gguf | Q6_K | 6 | 32.1 GB | very large, extremely low quality loss |
| Meta-Llama-3-70B-Instruct-Q6_K-00002-of-00002.gguf | Q6_K | 6 | 25.7 GB | very large, extremely low quality loss |
| Meta-Llama-3-70B-Instruct-Q8_0-00001-of-00003.gguf | Q8_0 | 8 | 32 GB | very large, extremely low quality loss - not recommended |
| Meta-Llama-3-70B-Instruct-Q8_0-00002-of-00003.gguf | Q8_0 | 8 | 32.1 GB | very large, extremely low quality loss - not recommended |
| Meta-Llama-3-70B-Instruct-Q8_0-00003-of-00003.gguf | Q8_0 | 8 | 10.9 GB | very large, extremely low quality loss - not recommended |
| Meta-Llama-3-70B-Instruct-f16-00001-of-00005.gguf | f16 | 16 | 32.1 GB | |
| Meta-Llama-3-70B-Instruct-f16-00002-of-00005.gguf | f16 | 16 | 32 GB | |
| Meta-Llama-3-70B-Instruct-f16-00003-of-00005.gguf | f16 | 16 | 32 GB | |
| Meta-Llama-3-70B-Instruct-f16-00004-of-00005.gguf | f16 | 16 | 31.7 GB | |
| Meta-Llama-3-70B-Instruct-f16-00005-of-00005.gguf | f16 | 16 | 13.1 GB | |