
Bharat PII Gemma 3 1B (GGUF)

Files

  • bharat-pii-gemma-3-1b-it-v0.3-f16.gguf
  • bharat-pii-gemma-3-1b-it-v0.3_q8_0.gguf

Notes

  • GGUF is self-contained (tokenizer/config embedded).
  • Converted with llama.cpp. Suggested: q8_0 for general use, f16 for reference.
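
Since the GGUF embeds the tokenizer and config, the file can be run directly with llama.cpp. A minimal sketch using llama.cpp's `llama-cli` with the suggested q8_0 file (the model path assumes the file is in the current directory, and the prompt is a hypothetical example):

```shell
# Run the 8-bit quant with llama.cpp's llama-cli.
# -m: path to the GGUF (adjust to where you downloaded it)
# -p: prompt text, -n: max tokens to generate
llama-cli \
  -m ./bharat-pii-gemma-3-1b-it-v0.3_q8_0.gguf \
  -p "Redact the PII in: My name is Ravi, phone 98xxxxxx10." \
  -n 128
```

For the f16 reference file, swap the `-m` argument for `bharat-pii-gemma-3-1b-it-v0.3-f16.gguf`; output should match the q8_0 quant closely at roughly twice the memory cost.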

Details

  • Model size: 1.0B params
  • Architecture: gemma3
  • Quantizations: 8-bit (q8_0), 16-bit (f16)