Hugging Face
Momix-44 / gemma-4-31B-it-heretic-v2
Likes: 3
Pipeline: Image-Text-to-Text
Libraries: Transformers · Safetensors · GGUF
Tags: gemma4 · heretic · uncensored · decensored · abliterated · conversational
License: apache-2.0
Branch: main · gemma-4-31B-it-heretic-v2 · 216 GB total
1 contributor · History: 30 commits
Latest commit (Momix-44, verified): Delete model-00002-of-00002.safetensors · 7b4f47b · 10 days ago
File | Scan | Size | Last commit message | Age
.gitattributes | Safe | 1.71 kB | Upload folder using huggingface_hub | 14 days ago
README.md | Safe | 27.7 kB | Upload folder using huggingface_hub | 15 days ago
bartowski_calibration_datav5.txt | Safe | 1.72 MB | Scheduled Commit | 18 days ago
bartowski_gemma4_calibration_optimized_92k_2.txt | Safe | 429 kB | Upload folder using huggingface_hub | 17 days ago
chat_template.jinja | Safe | 16.3 kB | Update chat_template.jinja | 10 days ago
config.json | Safe | 4.65 kB | Upload folder using huggingface_hub | 18 days ago
dual-optimized.imatrix.dat | pickle (no problematic imports detected) | 13.8 MB | Upload folder using huggingface_hub | 16 days ago
dual-optimized.imatrix.gguf | pickle (no problematic imports detected) | 13.8 MB | Upload folder using huggingface_hub | 17 days ago
gemma-4-31B-it-heretic-v2-BF16.gguf | - | 61.4 GB | Upload folder using huggingface_hub | 14 days ago
gemma-4-31B-it-heretic-v2-IQ1_M.gguf | - | 7.73 GB | Upload folder using huggingface_hub | 16 days ago
gemma-4-31B-it-heretic-v2-IQ2_M.gguf | - | 10.9 GB | Upload gemma-4-31B-it-heretic-v2-IQ2_M.gguf with huggingface_hub | 17 days ago
gemma-4-31B-it-heretic-v2-IQ2_XS.gguf | - | 9.53 GB | Upload folder using huggingface_hub | 16 days ago
gemma-4-31B-it-heretic-v2-IQ2_XXS.gguf | - | 8.67 GB | Upload folder using huggingface_hub | 16 days ago
gemma-4-31B-it-heretic-v2-IQ3_XS.gguf | - | 13.1 GB | Upload folder using huggingface_hub | 15 days ago
gemma-4-31B-it-heretic-v2-IQ3_XXS.gguf | - | 12.1 GB | Upload folder using huggingface_hub | 15 days ago
gemma-4-31B-it-heretic-v2-IQ4_NL.gguf | - | 17.7 GB | Upload folder using huggingface_hub | 15 days ago
gemma-4-31B-it-heretic-v2-IQ4_XS.gguf | - | 16.7 GB | Upload folder using huggingface_hub | 16 days ago
gemma-4-31B-it-heretic-v2-Q4_K_M.gguf | - | 18.7 GB | Upload folder using huggingface_hub | 15 days ago
gemma-4-31B-it-heretic-v2-Q4_K_S.gguf | - | 17.8 GB | Upload folder using huggingface_hub | 15 days ago
gemma-4-31B-it-heretic-v2-Q5_K_S.gguf | - | 21.3 GB | Upload folder using huggingface_hub | 15 days ago
generation_config.json | Safe | 203 Bytes | Upload folder using huggingface_hub | 18 days ago
imatrix_generation_log_gemma-4-31B-it-heretic-v2_bartowski-datav5.txt | Safe | 20.8 kB | Upload folder using huggingface_hub | 16 days ago
model.safetensors.index.json | Safe | 119 kB | Upload folder using huggingface_hub | 18 days ago
preprocessor_config.json | Safe | 1.69 kB | Upload folder using huggingface_hub | 18 days ago
tokenizer.json | Safe | 32.2 MB | Upload folder using huggingface_hub | 18 days ago
tokenizer_config.json | Safe | 2.69 kB | Upload folder using huggingface_hub | 18 days ago
wikitext-2-raw-v1.zip | pickle (no problematic imports detected) | 4.72 MB | Upload folder using huggingface_hub | 16 days ago