Update README.md

Recommended Sampler: LCM

Recommended Scheduler: SIMPLE

Recommended Strength: 0.3-0.6

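The strength setting scales the LoRA's contribution before it is merged into the base weights, conceptually W' = W + strength · (B·A). A minimal pure-Python sketch of that scaling (illustrative names and shapes only, not the actual inference code):

```python
def apply_lora(base, lora_a, lora_b, strength):
    """Return base + strength * (B @ A), using plain nested lists.

    base: m x n, lora_b: m x r, lora_a: r x n (r is the LoRA rank).
    Illustrative only -- real pipelines do this with framework tensors.
    """
    m, n, r = len(base), len(base[0]), len(lora_a)
    out = [row[:] for row in base]  # copy so base is left untouched
    for i in range(m):
        for j in range(n):
            # low-rank delta for this weight entry
            delta = sum(lora_b[i][k] * lora_a[k][j] for k in range(r))
            out[i][j] += strength * delta
    return out
```

At strength 0.0 the base model is unchanged; values in the recommended 0.3-0.6 range blend in only a fraction of the trained delta, which tends to keep the base model's composition while still applying the LoRA's style.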
This model was trained for 2000 steps (2 repeats) with a learning rate of 4e-4, using Simple Tuner's main branch. The dataset was around 90 synthetic images in total, all at a 1:1 aspect ratio (1024x1024) to fit into VRAM.

Training took around 3 hours on an RTX 4090 with 24GB of VRAM; training times are on par with Flux LoRA training. Captioning was done using Joy Caption Batch with modified instructions and a token limit of 128 tokens (anything longer gets truncated during training).
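Because anything past the token limit is silently cut off at training time, pre-trimming captions avoids losing a sentence mid-thought. A rough sketch using naive whitespace tokens (the trainer counts model-tokenizer tokens, which usually outnumber words, so treat this as an approximation; the function name is hypothetical):

```python
def truncate_caption(caption: str, max_tokens: int = 128) -> str:
    """Keep at most max_tokens whitespace-separated words.

    Approximation: training counts tokenizer tokens, which are usually
    more numerous than words, so leave some headroom below the limit.
    """
    words = caption.split()
    return " ".join(words[:max_tokens])
```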