https://huggingface.co/TenStrip/LTX2.3-10Eros_Workflows

Reliant on data from https://huggingface.co/SulphurAI/Sulphur-2-base

This is a different merge attempt aimed at ideal I2V use. It uses layer-scaled merges of different steps rather than a straight weight merge. It behaves much more nicely than loading a LoRA and respects the prompt. Prompts should be enhanced: LTX has very little self-reasoning input, so it must be guided and conditioned.

BF16 loads as a checkpoint with CLIP and VAEs.

The Kijai split files are for the 10Eros FP8 Transformer version, which goes inside diffusion_models:

https://huggingface.co/Kijai/LTX2.3_comfy/tree/main
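Assuming a standard ComfyUI install, the placement described above can be sketched as follows. The directory paths and the `huggingface-cli` invocation are assumptions about a typical setup, not instructions taken from either repo; adjust them to your install.

```shell
# Sketch of a standard ComfyUI model layout (paths assumed, adjust as needed)
COMFY=./ComfyUI
mkdir -p "$COMFY/models/checkpoints"        # BF16 checkpoint (loads with CLIP and VAEs)
mkdir -p "$COMFY/models/diffusion_models"   # Kijai FP8 transformer split files go here

# Example download (assumes the huggingface_hub CLI is installed; run manually):
# huggingface-cli download Kijai/LTX2.3_comfy --local-dir "$COMFY/models/diffusion_models"

ls "$COMFY/models"
```

After copying the files, the FP8 transformer is picked up by ComfyUI's diffusion-model loader, while the BF16 file loads through the regular checkpoint loader.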

Larger distilled LoRAs will harm the model's fine-tune; try the cond_safe ones: