# Hugging Face Publish

## 1) Go to the release folder

```bash
cd /Users/robert/bibleai-backup/release/BibleAI-Gemma4-E4B-CPT-SFT-DPO-20260414
```

## 2) Verify GGUF checksums

```bash
sha256sum gguf/final_merged.BF16.gguf gguf/final_merged.Q8_0.gguf
```

Expected:

- `e07e38d28d3032d3b438b7b8b90cbf4cf5e66177b52e8f60673cac3586dc10a1`  `final_merged.BF16.gguf`
- `3c7f5f9caf080fe44720f16b5f4b5e7e95a097d6be3d1d8d89aea22e8574bad1`  `final_merged.Q8_0.gguf`

## 3) Log in to Hugging Face

```bash
huggingface-cli login
```

or

```bash
hf auth login
```

## 4) Push the full release

```bash
HF_REPO=<org>/<repo> ./scripts/upload_to_hf.sh
```

Example:

```bash
HF_REPO=rhemabible/BibleAI ./scripts/upload_to_hf.sh
```

This pushes:

- the merged Hugging Face model (`hf/`)
- GGUF files (`gguf/`, BF16 + Q8_0)
- Ollama Modelfiles (`ollama/`)
- adapters, checksums, logs, and docs

Canonical names:

- HF repo: `rhemabible/BibleAI`
- Ollama models: `bibleaiq8` and `bibleaibf16`
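The checksum step (step 2) can also be driven by `sha256sum -c` against a sums file, which exits non-zero on any mismatch. Below is a minimal self-contained sketch of that workflow using a throwaway stand-in file; in the real release you would check the two GGUFs against the expected digests listed above. The `SHA256SUMS` filename is an assumption, not something the release scripts are known to produce.

```shell
# Sketch of sha256sum-based verification (assumed workflow, not from the
# release scripts). Uses a throwaway file instead of the real GGUFs.
tmpdir=$(mktemp -d)
cd "$tmpdir"

printf 'demo payload' > model.gguf          # stand-in for a GGUF file
sha256sum model.gguf > SHA256SUMS           # record the expected digest
sha256sum -c SHA256SUMS && echo "checksum OK"  # non-zero exit on mismatch
```

For the real release, the equivalent would be a `SHA256SUMS` file containing the two expected lines from step 2, checked from the release folder.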
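Since `upload_to_hf.sh` takes the target repo via the `HF_REPO` environment variable, a cheap guard before invoking it is to check that the value looks like `<org>/<repo>`. This is a hypothetical pre-flight check, not something the script is known to do itself:

```shell
# Hypothetical guard: fail fast if HF_REPO is unset or not of the form
# <org>/<repo> (e.g. rhemabible/BibleAI) before running upload_to_hf.sh.
HF_REPO="rhemabible/BibleAI"

case "$HF_REPO" in
  */*) echo "HF_REPO ok: $HF_REPO" ;;            # looks like org/repo
  *)   echo "error: HF_REPO must be <org>/<repo>" >&2; exit 1 ;;
esac
```

Wiring this into the top of `upload_to_hf.sh` would turn a confusing mid-upload failure into an immediate, clear error.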