
phanerozoic/deep-plantain

Tags: Depth Estimation · Diffusers · Safetensors · English · lora · flux2 · vision-banana

Instructions for using phanerozoic/deep-plantain with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.

  • Libraries
  • Diffusers

    How to use phanerozoic/deep-plantain with Diffusers:

    pip install -U diffusers transformers accelerate

    import torch
    from diffusers import DiffusionPipeline

    # Switch device_map to "mps" on Apple-silicon devices
    pipe = DiffusionPipeline.from_pretrained(
        "black-forest-labs/FLUX.2-klein-base-4B",
        dtype=torch.bfloat16,
        device_map="cuda",
    )
    pipe.load_lora_weights("phanerozoic/deep-plantain")

    prompt = "Astronaut in a jungle, cold color palette, muted colors, detailed, 8k"
    image = pipe(prompt).images[0]
    image.save("deep-plantain-sample.png")
  • Notebooks
  • Google Colab
  • Kaggle
  • Local Apps
  • Draw Things
deep-plantain / pending/tokenizer (15.9 MB)
  • 1 contributor
History: 1 commit
phanerozoic
Add pending/ - rank-256 transformer LoRA + merged text-encoder (deep-plantain-3500)
377ef64 verified 9 days ago
  • added_tokens.json (735 Bytes)
  • chat_template.jinja (4.26 kB)
  • merges.txt (1.67 MB)
  • special_tokens_map.json (644 Bytes)
  • tokenizer.json (11.4 MB)
  • tokenizer_config.json (5.64 kB)
  • vocab.json (2.78 MB)