do you have 80gb gpu or ...

#2
by johndpope - opened

Did you manage rank 128 on a 5090?

I have an RTX 5090. I've managed to use the LoRA on it for inference, but I've never trained a LoRA on it. This specific LoRA was trained on a VM with an RTX 6000 Pro.
Now that other trainers like AI Toolkit support fine-tuning LTX-2 with lower VRAM, I should try it.
And I'd love to try re-training a LoRA like this one at rank 32 instead when I get the chance.
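For intuition on why dropping from rank 128 to rank 32 helps on a lower-VRAM card, here's a minimal sketch of how LoRA's trainable-parameter count scales with rank. The layer dimensions and the number of adapted projections are illustrative assumptions, not LTX-2's actual architecture:

```python
# Rough sketch: LoRA replaces a full weight update with two low-rank
# matrices A (rank x d_in) and B (d_out x rank), so trainable params
# (and the optimizer state held in VRAM) scale linearly with rank.

def lora_params(d_in: int, d_out: int, rank: int) -> int:
    """Trainable parameters LoRA adds to one linear layer."""
    return rank * d_in + d_out * rank

# Hypothetical block: four 4096x4096 attention projections (q, k, v, o).
d = 4096
rank_128 = 4 * lora_params(d, d, rank=128)
rank_32 = 4 * lora_params(d, d, rank=32)
print(rank_128 // rank_32)  # rank 32 trains 4x fewer parameters
```

The 4x reduction in trainable parameters also shrinks the Adam optimizer states proportionally, which is usually the deciding factor on a 32 GB card.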
