second-state/moxin-reasoning-7b-GGUF

Maintained by Second State. Tags: GGUF, mistral, conversational. License: apache-2.0.
Repository (branch main): 77.4 GB total, 1 contributor (apepkuss79), 10 commits. Latest commit: a5b77b1 (verified, 11 months ago), "Update README.md".
| File | Size | Last commit |
|------|------|-------------|
| .gitattributes | 2.38 kB | Update models (12 months ago) |
| README.md | 4.09 kB | Update README.md (11 months ago) |
| config.json | 735 Bytes | Update models (12 months ago) |
| moxin-reasoning-7b-Q2_K.gguf | 3.04 GB | Update models (12 months ago) |
| moxin-reasoning-7b-Q3_K_L.gguf | 4.28 GB | Update models (12 months ago) |
| moxin-reasoning-7b-Q3_K_M.gguf | 3.94 GB | Update models (12 months ago) |
| moxin-reasoning-7b-Q3_K_S.gguf | 3.54 GB | Update models (12 months ago) |
| moxin-reasoning-7b-Q4_0.gguf | 4.6 GB | Update models (12 months ago) |
| moxin-reasoning-7b-Q4_K_M.gguf | 4.89 GB | Update models (12 months ago) |
| moxin-reasoning-7b-Q4_K_S.gguf | 4.63 GB | Update models (12 months ago) |
| moxin-reasoning-7b-Q5_0.gguf | 5.6 GB | Update models (12 months ago) |
| moxin-reasoning-7b-Q5_K_M.gguf | 5.75 GB | Update models (12 months ago) |
| moxin-reasoning-7b-Q5_K_S.gguf | 5.6 GB | Update models (12 months ago) |
| moxin-reasoning-7b-Q6_K.gguf | 6.66 GB | Update models (12 months ago) |
| moxin-reasoning-7b-Q8_0.gguf | 8.62 GB | Update models (12 months ago) |
| moxin-reasoning-7b-f16.gguf | 16.2 GB | Update models (12 months ago) |
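The quantized files trade size for precision. As a rough sanity check on the effective bits per weight of each variant, one can infer the parameter count from the f16 file (2 bytes per weight) and divide each file's size by it. A minimal sketch using only the sizes listed above; the figures are approximate, since GGUF files also carry metadata and the listed sizes are rounded:

```python
# Approximate bits per weight for each GGUF quantization in this repo,
# derived only from the file sizes listed above. The f16 file stores
# 16 bits (2 bytes) per weight, so it fixes the parameter count.
sizes_gb = {
    "Q2_K": 3.04, "Q3_K_S": 3.54, "Q3_K_M": 3.94, "Q3_K_L": 4.28,
    "Q4_0": 4.60, "Q4_K_S": 4.63, "Q4_K_M": 4.89,
    "Q5_0": 5.60, "Q5_K_S": 5.60, "Q5_K_M": 5.75,
    "Q6_K": 6.66, "Q8_0": 8.62, "f16": 16.2,
}

n_params = sizes_gb["f16"] * 1e9 / 2  # implied parameter count at 2 bytes/weight

bits_per_weight = {
    name: gb * 1e9 * 8 / n_params for name, gb in sizes_gb.items()
}

for name, bits in sorted(bits_per_weight.items(), key=lambda kv: kv[1]):
    print(f"{name:8s} ~{bits:.2f} bits/weight")
```

For instance, Q4_K_M works out to roughly 4.8 bits per weight, consistent with its name; Q2_K lands near 3 bits, reflecting the extra overhead of the K-quant scheme at very low bit widths.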