Kquant03/Samlagast-7B-GGUF
Tags: GGUF, English, mergekit, Merge
Paper: arxiv:2212.04089
License: apache-2.0
Samlagast-7B-GGUF (branch: main): 38.5 GB total, 1 contributor, 15 commits
Latest commit: 26e4b94 (verified) by Kquant03, "Update README.md", almost 2 years ago
File                                   Size      Last commit
.gitattributes                         2.55 kB   Rename ggml-model-q8_0.gguf to Samlagast-7B-ggml-model-q8_0.gguf
README.md                              1.64 kB   Update README.md
Samlagast-7B-ggml-model-q2_k.gguf      2.72 GB   Rename ggml-model-q2_k.gguf to Samlagast-7B-ggml-model-q2_k.gguf
Samlagast-7B-ggml-model-q3_k_m.gguf    3.52 GB   Rename ggml-model-q3_k_m.gguf to Samlagast-7B-ggml-model-q3_k_m.gguf
Samlagast-7B-ggml-model-q4_0.gguf      4.11 GB   Rename ggml-model-q4_0.gguf to Samlagast-7B-ggml-model-q4_0.gguf
Samlagast-7B-ggml-model-q4_k_m.gguf    4.37 GB   Rename ggml-model-q4_k_m.gguf to Samlagast-7B-ggml-model-q4_k_m.gguf
Samlagast-7B-ggml-model-q5_0.gguf      5 GB      Rename ggml-model-q5_0.gguf to Samlagast-7B-ggml-model-q5_0.gguf
Samlagast-7B-ggml-model-q5_k_m.gguf    5.13 GB   Rename ggml-model-q5_k_m.gguf to Samlagast-7B-ggml-model-q5_k_m.gguf
Samlagast-7B-ggml-model-q6_k.gguf      5.94 GB   Rename ggml-model-q6_k.gguf to Samlagast-7B-ggml-model-q6_k.gguf
Samlagast-7B-ggml-model-q8_0.gguf      7.7 GB    Rename ggml-model-q8_0.gguf to Samlagast-7B-ggml-model-q8_0.gguf

All files were last updated almost 2 years ago.
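To use one of the quantized files above, it can be fetched from this repository with the huggingface_hub Python package. The following is a minimal sketch, assuming huggingface_hub is installed; the repo id and filename are taken from the listing on this page, and the choice of the q4_k_m variant is only an example.

# Minimal sketch: download one quantized GGUF file from this repository.
# Assumes the huggingface_hub package is installed (pip install huggingface_hub).
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="Kquant03/Samlagast-7B-GGUF",
    filename="Samlagast-7B-ggml-model-q4_k_m.gguf",  # 4.37 GB variant from the table above
)
print(local_path)  # location of the file in the local Hugging Face cache

The downloaded .gguf file can then be loaded by any GGUF-compatible runtime (for example, llama.cpp-based tools), with smaller quantizations such as q2_k trading quality for lower memory use and larger ones such as q8_0 doing the reverse.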