Update README.md
README.md CHANGED
@@ -5,7 +5,9 @@ This is a different merge attempt for ideal I2V use. It uses layer scaled merges
BF16 loads as a checkpoint with clip and VAEs.

+ Fp8_mixed_learned is the better FP8 version and is a full checkpoint as well, quant by S1LV3RC01N.
+
+ Kijai split files are for the 10Eros FP8 Transformer version, but it has a different structure and variance. That one goes inside diffusion_models:
https://huggingface.co/Kijai/LTX2.3_comfy/tree/main
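A minimal sketch of pulling those split files into place with `huggingface_hub` — the repo id comes from the link above, and the `ComfyUI/models/diffusion_models` target is the stock ComfyUI folder layout; the subfolder name and the assumption that ComfyUI sits in the current directory are mine, so adjust for your install:

```python
# Sketch: download the Kijai split FP8 transformer files into ComfyUI's
# diffusion_models folder. The full BF16 / fp8_mixed_learned checkpoints go
# into models/checkpoints instead and load through the normal checkpoint loader.
# Paths assume a stock ComfyUI layout in the current directory -- adjust as needed.
from pathlib import Path
from huggingface_hub import snapshot_download

target = Path("ComfyUI/models/diffusion_models/LTX2.3_comfy")
target.mkdir(parents=True, exist_ok=True)

snapshot_download(
    repo_id="Kijai/LTX2.3_comfy",  # repo linked above
    local_dir=target,
)
print(f"Split transformer files downloaded to {target}")
```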
!!! Larger distilled Loras will harm the model's fine-tuning; try the cond_safe ones: