TobiGeth committed
Commit ac8396c · verified · 1 parent: ebf1498

Upload folder using huggingface_hub

Files changed (3):
  1. README.md +46 -0
  2. config.yaml +328 -0
  3. lora.safetensors +3 -0
README.md ADDED
---
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
language:
- en
tags:
- flux
- diffusers
- lora
- replicate
base_model: "black-forest-labs/FLUX.1-dev"
pipeline_tag: text-to-image
# widget:
#   - text: >-
#       prompt
#     output:
#       url: https://...
instance_prompt: KHKuser
---

# Khiki2

<Gallery />

Trained on Replicate using:

https://replicate.com/ostris/flux-dev-lora-trainer/train

## Trigger words

You should use `KHKuser` to trigger the image generation.

## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers)

```py
from diffusers import AutoPipelineForText2Image
import torch

pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.float16).to('cuda')
pipeline.load_lora_weights('TobiGeth/khiki2', weight_name='lora.safetensors')
image = pipeline('your prompt').images[0]
```

For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters).
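Since generation only picks up the trained concept when the trigger word `KHKuser` appears in the prompt, a tiny helper can guard against forgetting it. `with_trigger` below is a hypothetical convenience function, not part of diffusers:

```python
def with_trigger(prompt: str, trigger: str = "KHKuser") -> str:
    """Prepend the trigger word if the prompt does not already contain it."""
    if trigger.lower() in prompt.lower():
        return prompt
    return f"{trigger}, {prompt}"


# Use it when calling the pipeline, e.g.:
#   image = pipeline(with_trigger('portrait photo, soft light')).images[0]
print(with_trigger("portrait photo, soft light"))  # "KHKuser, portrait photo, soft light"
```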
config.yaml ADDED
```yaml
job: custom_job
config:
  name: flux_train_replicate
  process:
  - type: custom_sd_trainer
    training_folder: output
    device: cuda:0
    trigger_word: KHKuser
    network:
      type: lora
      linear: 32
      linear_alpha: 32
      network_kwargs:
        only_if_contains:
        - transformer.transformer_blocks.0.attn.to_q
        - transformer.transformer_blocks.0.attn.to_k
        - transformer.transformer_blocks.0.attn.to_v
        - transformer.transformer_blocks.0.attn.add_k_proj
        - transformer.transformer_blocks.0.attn.add_v_proj
        - transformer.transformer_blocks.0.attn.add_q_proj
        - transformer.transformer_blocks.0.attn.to_out.0
        - transformer.transformer_blocks.0.attn.to_add_out
        - transformer.transformer_blocks.1.attn.to_q
        - transformer.transformer_blocks.1.attn.to_k
        - transformer.transformer_blocks.1.attn.to_v
        - transformer.transformer_blocks.1.attn.add_k_proj
        - transformer.transformer_blocks.1.attn.add_v_proj
        - transformer.transformer_blocks.1.attn.add_q_proj
        - transformer.transformer_blocks.1.attn.to_out.0
        - transformer.transformer_blocks.1.attn.to_add_out
        - transformer.transformer_blocks.2.attn.to_q
        - transformer.transformer_blocks.2.attn.to_k
        - transformer.transformer_blocks.2.attn.to_v
        - transformer.transformer_blocks.2.attn.add_k_proj
        - transformer.transformer_blocks.2.attn.add_v_proj
        - transformer.transformer_blocks.2.attn.add_q_proj
        - transformer.transformer_blocks.2.attn.to_out.0
        - transformer.transformer_blocks.2.attn.to_add_out
        - transformer.transformer_blocks.3.attn.to_q
        - transformer.transformer_blocks.3.attn.to_k
        - transformer.transformer_blocks.3.attn.to_v
        - transformer.transformer_blocks.3.attn.add_k_proj
        - transformer.transformer_blocks.3.attn.add_v_proj
        - transformer.transformer_blocks.3.attn.add_q_proj
        - transformer.transformer_blocks.3.attn.to_out.0
        - transformer.transformer_blocks.3.attn.to_add_out
        - transformer.transformer_blocks.4.attn.to_q
        - transformer.transformer_blocks.4.attn.to_k
        - transformer.transformer_blocks.4.attn.to_v
        - transformer.transformer_blocks.4.attn.add_k_proj
        - transformer.transformer_blocks.4.attn.add_v_proj
        - transformer.transformer_blocks.4.attn.add_q_proj
        - transformer.transformer_blocks.4.attn.to_out.0
        - transformer.transformer_blocks.4.attn.to_add_out
        - transformer.transformer_blocks.5.attn.to_q
        - transformer.transformer_blocks.5.attn.to_k
        - transformer.transformer_blocks.5.attn.to_v
        - transformer.transformer_blocks.5.attn.add_k_proj
        - transformer.transformer_blocks.5.attn.add_v_proj
        - transformer.transformer_blocks.5.attn.add_q_proj
        - transformer.transformer_blocks.5.attn.to_out.0
        - transformer.transformer_blocks.5.attn.to_add_out
        - transformer.transformer_blocks.6.attn.to_q
        - transformer.transformer_blocks.6.attn.to_k
        - transformer.transformer_blocks.6.attn.to_v
        - transformer.transformer_blocks.6.attn.add_k_proj
        - transformer.transformer_blocks.6.attn.add_v_proj
        - transformer.transformer_blocks.6.attn.add_q_proj
        - transformer.transformer_blocks.6.attn.to_out.0
        - transformer.transformer_blocks.6.attn.to_add_out
        - transformer.transformer_blocks.7.attn.to_q
        - transformer.transformer_blocks.7.attn.to_k
        - transformer.transformer_blocks.7.attn.to_v
        - transformer.transformer_blocks.7.attn.add_k_proj
        - transformer.transformer_blocks.7.attn.add_v_proj
        - transformer.transformer_blocks.7.attn.add_q_proj
        - transformer.transformer_blocks.7.attn.to_out.0
        - transformer.transformer_blocks.7.attn.to_add_out
        - transformer.transformer_blocks.8.attn.to_q
        - transformer.transformer_blocks.8.attn.to_k
        - transformer.transformer_blocks.8.attn.to_v
        - transformer.transformer_blocks.8.attn.add_k_proj
        - transformer.transformer_blocks.8.attn.add_v_proj
        - transformer.transformer_blocks.8.attn.add_q_proj
        - transformer.transformer_blocks.8.attn.to_out.0
        - transformer.transformer_blocks.8.attn.to_add_out
        - transformer.transformer_blocks.9.attn.to_q
        - transformer.transformer_blocks.9.attn.to_k
        - transformer.transformer_blocks.9.attn.to_v
        - transformer.transformer_blocks.9.attn.add_k_proj
        - transformer.transformer_blocks.9.attn.add_v_proj
        - transformer.transformer_blocks.9.attn.add_q_proj
        - transformer.transformer_blocks.9.attn.to_out.0
        - transformer.transformer_blocks.9.attn.to_add_out
        - transformer.transformer_blocks.10.attn.to_q
        - transformer.transformer_blocks.10.attn.to_k
        - transformer.transformer_blocks.10.attn.to_v
        - transformer.transformer_blocks.10.attn.add_k_proj
        - transformer.transformer_blocks.10.attn.add_v_proj
        - transformer.transformer_blocks.10.attn.add_q_proj
        - transformer.transformer_blocks.10.attn.to_out.0
        - transformer.transformer_blocks.10.attn.to_add_out
        - transformer.transformer_blocks.11.attn.to_q
        - transformer.transformer_blocks.11.attn.to_k
        - transformer.transformer_blocks.11.attn.to_v
        - transformer.transformer_blocks.11.attn.add_k_proj
        - transformer.transformer_blocks.11.attn.add_v_proj
        - transformer.transformer_blocks.11.attn.add_q_proj
        - transformer.transformer_blocks.11.attn.to_out.0
        - transformer.transformer_blocks.11.attn.to_add_out
        - transformer.transformer_blocks.12.attn.to_q
        - transformer.transformer_blocks.12.attn.to_k
        - transformer.transformer_blocks.12.attn.to_v
        - transformer.transformer_blocks.12.attn.add_k_proj
        - transformer.transformer_blocks.12.attn.add_v_proj
        - transformer.transformer_blocks.12.attn.add_q_proj
        - transformer.transformer_blocks.12.attn.to_out.0
        - transformer.transformer_blocks.12.attn.to_add_out
        - transformer.transformer_blocks.13.attn.to_q
        - transformer.transformer_blocks.13.attn.to_k
        - transformer.transformer_blocks.13.attn.to_v
        - transformer.transformer_blocks.13.attn.add_k_proj
        - transformer.transformer_blocks.13.attn.add_v_proj
        - transformer.transformer_blocks.13.attn.add_q_proj
        - transformer.transformer_blocks.13.attn.to_out.0
        - transformer.transformer_blocks.13.attn.to_add_out
        - transformer.transformer_blocks.14.attn.to_q
        - transformer.transformer_blocks.14.attn.to_k
        - transformer.transformer_blocks.14.attn.to_v
        - transformer.transformer_blocks.14.attn.add_k_proj
        - transformer.transformer_blocks.14.attn.add_v_proj
        - transformer.transformer_blocks.14.attn.add_q_proj
        - transformer.transformer_blocks.14.attn.to_out.0
        - transformer.transformer_blocks.14.attn.to_add_out
        - transformer.transformer_blocks.15.attn.to_q
        - transformer.transformer_blocks.15.attn.to_k
        - transformer.transformer_blocks.15.attn.to_v
        - transformer.transformer_blocks.15.attn.add_k_proj
        - transformer.transformer_blocks.15.attn.add_v_proj
        - transformer.transformer_blocks.15.attn.add_q_proj
        - transformer.transformer_blocks.15.attn.to_out.0
        - transformer.transformer_blocks.15.attn.to_add_out
        - transformer.transformer_blocks.16.attn.to_q
        - transformer.transformer_blocks.16.attn.to_k
        - transformer.transformer_blocks.16.attn.to_v
        - transformer.transformer_blocks.16.attn.add_k_proj
        - transformer.transformer_blocks.16.attn.add_v_proj
        - transformer.transformer_blocks.16.attn.add_q_proj
        - transformer.transformer_blocks.16.attn.to_out.0
        - transformer.transformer_blocks.16.attn.to_add_out
        - transformer.transformer_blocks.17.attn.to_q
        - transformer.transformer_blocks.17.attn.to_k
        - transformer.transformer_blocks.17.attn.to_v
        - transformer.transformer_blocks.17.attn.add_k_proj
        - transformer.transformer_blocks.17.attn.add_v_proj
        - transformer.transformer_blocks.17.attn.add_q_proj
        - transformer.transformer_blocks.17.attn.to_out.0
        - transformer.transformer_blocks.17.attn.to_add_out
        - transformer.transformer_blocks.18.attn.to_q
        - transformer.transformer_blocks.18.attn.to_k
        - transformer.transformer_blocks.18.attn.to_v
        - transformer.transformer_blocks.18.attn.add_k_proj
        - transformer.transformer_blocks.18.attn.add_v_proj
        - transformer.transformer_blocks.18.attn.add_q_proj
        - transformer.transformer_blocks.18.attn.to_out.0
        - transformer.transformer_blocks.18.attn.to_add_out
        - transformer.single_transformer_blocks.0.attn.to_q
        - transformer.single_transformer_blocks.0.attn.to_k
        - transformer.single_transformer_blocks.0.attn.to_v
        - transformer.single_transformer_blocks.1.attn.to_q
        - transformer.single_transformer_blocks.1.attn.to_k
        - transformer.single_transformer_blocks.1.attn.to_v
        - transformer.single_transformer_blocks.2.attn.to_q
        - transformer.single_transformer_blocks.2.attn.to_k
        - transformer.single_transformer_blocks.2.attn.to_v
        - transformer.single_transformer_blocks.3.attn.to_q
        - transformer.single_transformer_blocks.3.attn.to_k
        - transformer.single_transformer_blocks.3.attn.to_v
        - transformer.single_transformer_blocks.4.attn.to_q
        - transformer.single_transformer_blocks.4.attn.to_k
        - transformer.single_transformer_blocks.4.attn.to_v
        - transformer.single_transformer_blocks.5.attn.to_q
        - transformer.single_transformer_blocks.5.attn.to_k
        - transformer.single_transformer_blocks.5.attn.to_v
        - transformer.single_transformer_blocks.6.attn.to_q
        - transformer.single_transformer_blocks.6.attn.to_k
        - transformer.single_transformer_blocks.6.attn.to_v
        - transformer.single_transformer_blocks.7.attn.to_q
        - transformer.single_transformer_blocks.7.attn.to_k
        - transformer.single_transformer_blocks.7.attn.to_v
        - transformer.single_transformer_blocks.8.attn.to_q
        - transformer.single_transformer_blocks.8.attn.to_k
        - transformer.single_transformer_blocks.8.attn.to_v
        - transformer.single_transformer_blocks.9.attn.to_q
        - transformer.single_transformer_blocks.9.attn.to_k
        - transformer.single_transformer_blocks.9.attn.to_v
        - transformer.single_transformer_blocks.10.attn.to_q
        - transformer.single_transformer_blocks.10.attn.to_k
        - transformer.single_transformer_blocks.10.attn.to_v
        - transformer.single_transformer_blocks.11.attn.to_q
        - transformer.single_transformer_blocks.11.attn.to_k
        - transformer.single_transformer_blocks.11.attn.to_v
        - transformer.single_transformer_blocks.12.attn.to_q
        - transformer.single_transformer_blocks.12.attn.to_k
        - transformer.single_transformer_blocks.12.attn.to_v
        - transformer.single_transformer_blocks.13.attn.to_q
        - transformer.single_transformer_blocks.13.attn.to_k
        - transformer.single_transformer_blocks.13.attn.to_v
        - transformer.single_transformer_blocks.14.attn.to_q
        - transformer.single_transformer_blocks.14.attn.to_k
        - transformer.single_transformer_blocks.14.attn.to_v
        - transformer.single_transformer_blocks.15.attn.to_q
        - transformer.single_transformer_blocks.15.attn.to_k
        - transformer.single_transformer_blocks.15.attn.to_v
        - transformer.single_transformer_blocks.16.attn.to_q
        - transformer.single_transformer_blocks.16.attn.to_k
        - transformer.single_transformer_blocks.16.attn.to_v
        - transformer.single_transformer_blocks.17.attn.to_q
        - transformer.single_transformer_blocks.17.attn.to_k
        - transformer.single_transformer_blocks.17.attn.to_v
        - transformer.single_transformer_blocks.18.attn.to_q
        - transformer.single_transformer_blocks.18.attn.to_k
        - transformer.single_transformer_blocks.18.attn.to_v
        - transformer.single_transformer_blocks.19.attn.to_q
        - transformer.single_transformer_blocks.19.attn.to_k
        - transformer.single_transformer_blocks.19.attn.to_v
        - transformer.single_transformer_blocks.20.attn.to_q
        - transformer.single_transformer_blocks.20.attn.to_k
        - transformer.single_transformer_blocks.20.attn.to_v
        - transformer.single_transformer_blocks.21.attn.to_q
        - transformer.single_transformer_blocks.21.attn.to_k
        - transformer.single_transformer_blocks.21.attn.to_v
        - transformer.single_transformer_blocks.22.attn.to_q
        - transformer.single_transformer_blocks.22.attn.to_k
        - transformer.single_transformer_blocks.22.attn.to_v
        - transformer.single_transformer_blocks.23.attn.to_q
        - transformer.single_transformer_blocks.23.attn.to_k
        - transformer.single_transformer_blocks.23.attn.to_v
        - transformer.single_transformer_blocks.24.attn.to_q
        - transformer.single_transformer_blocks.24.attn.to_k
        - transformer.single_transformer_blocks.24.attn.to_v
        - transformer.single_transformer_blocks.25.attn.to_q
        - transformer.single_transformer_blocks.25.attn.to_k
        - transformer.single_transformer_blocks.25.attn.to_v
        - transformer.single_transformer_blocks.26.attn.to_q
        - transformer.single_transformer_blocks.26.attn.to_k
        - transformer.single_transformer_blocks.26.attn.to_v
        - transformer.single_transformer_blocks.27.attn.to_q
        - transformer.single_transformer_blocks.27.attn.to_k
        - transformer.single_transformer_blocks.27.attn.to_v
        - transformer.single_transformer_blocks.28.attn.to_q
        - transformer.single_transformer_blocks.28.attn.to_k
        - transformer.single_transformer_blocks.28.attn.to_v
        - transformer.single_transformer_blocks.29.attn.to_q
        - transformer.single_transformer_blocks.29.attn.to_k
        - transformer.single_transformer_blocks.29.attn.to_v
        - transformer.single_transformer_blocks.30.attn.to_q
        - transformer.single_transformer_blocks.30.attn.to_k
        - transformer.single_transformer_blocks.30.attn.to_v
        - transformer.single_transformer_blocks.31.attn.to_q
        - transformer.single_transformer_blocks.31.attn.to_k
        - transformer.single_transformer_blocks.31.attn.to_v
        - transformer.single_transformer_blocks.32.attn.to_q
        - transformer.single_transformer_blocks.32.attn.to_k
        - transformer.single_transformer_blocks.32.attn.to_v
        - transformer.single_transformer_blocks.33.attn.to_q
        - transformer.single_transformer_blocks.33.attn.to_k
        - transformer.single_transformer_blocks.33.attn.to_v
        - transformer.single_transformer_blocks.34.attn.to_q
        - transformer.single_transformer_blocks.34.attn.to_k
        - transformer.single_transformer_blocks.34.attn.to_v
        - transformer.single_transformer_blocks.35.attn.to_q
        - transformer.single_transformer_blocks.35.attn.to_k
        - transformer.single_transformer_blocks.35.attn.to_v
        - transformer.single_transformer_blocks.36.attn.to_q
        - transformer.single_transformer_blocks.36.attn.to_k
        - transformer.single_transformer_blocks.36.attn.to_v
        - transformer.single_transformer_blocks.37.attn.to_q
        - transformer.single_transformer_blocks.37.attn.to_k
        - transformer.single_transformer_blocks.37.attn.to_v
    save:
      dtype: float16
      save_every: 2501
      max_step_saves_to_keep: 1
    datasets:
    - folder_path: input_images
      caption_ext: txt
      caption_dropout_rate: 0.05
      shuffle_tokens: false
      cache_latents_to_disk: false
      cache_latents: true
      resolution:
      - 512
      - 768
      - 1024
    train:
      batch_size: 1
      steps: 2500
      gradient_accumulation_steps: 1
      train_unet: true
      train_text_encoder: false
      content_or_style: balanced
      gradient_checkpointing: false
      noise_scheduler: flowmatch
      optimizer: adamw8bit
      lr: 3.0e-05
      ema_config:
        use_ema: true
        ema_decay: 0.99
      dtype: bf16
    model:
      name_or_path: FLUX.1-dev
      is_flux: true
      quantize: false
    sample:
      sampler: flowmatch
      sample_every: 2501
      width: 1024
      height: 1024
      prompts: []
      neg: ''
      seed: 42
      walk_seed: true
      guidance_scale: 3.5
      sample_steps: 28
meta:
  name: flux_train_replicate
  version: '1.0'
```
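The long `only_if_contains` list is mechanical: the eight attention projections of the 19 double-stream `transformer_blocks` plus the q/k/v projections of the 38 single-stream `single_transformer_blocks`, 266 entries in total. A sketch that regenerates the list (block counts and projection names taken from the config above):

```python
# Projections listed per double-stream block, in the config's order.
DOUBLE_PROJS = ("to_q", "to_k", "to_v", "add_k_proj", "add_v_proj",
                "add_q_proj", "to_out.0", "to_add_out")
# Single-stream blocks only target q/k/v.
SINGLE_PROJS = ("to_q", "to_k", "to_v")

double = [
    f"transformer.transformer_blocks.{i}.attn.{proj}"
    for i in range(19)
    for proj in DOUBLE_PROJS
]
single = [
    f"transformer.single_transformer_blocks.{i}.attn.{proj}"
    for i in range(38)
    for proj in SINGLE_PROJS
]
targets = double + single

print(len(targets))  # 266 LoRA target modules (19 * 8 + 38 * 3)
```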
lora.safetensors ADDED
```
version https://git-lfs.github.com/spec/v1
oid sha256:c8d9737abfe9e0efe401d9e48c7875b23bea864270e4dea1667515b05a3a4450
size 104666952
```
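The committed `lora.safetensors` is a Git LFS pointer file, not the weights themselves; the real ~100 MB blob is fetched by LFS on checkout. A minimal sketch of reading the pointer's key/value fields (`parse_lfs_pointer` is a hypothetical helper, not part of any library):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split each 'key value' line of a git-lfs pointer file into a dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields


pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:c8d9737abfe9e0efe401d9e48c7875b23bea864270e4dea1667515b05a3a4450
size 104666952"""

info = parse_lfs_pointer(pointer)
print(info["size"])  # "104666952" (bytes, ~100 MB)
```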