LoRA support?
Hi. First of all, thank you for converting this new model. I have a question, as the title says: I've tested this model after installing all the necessary dependencies, but a trained LoRA doesn't work at all with the regular LoRA loader. Inference runs fine, but the LoRA is completely ignored. Is there a way to make LoRAs work with your quantized SDNQ Int4 model? Thank you.
LoRA is not currently supported. In theory it should be possible to merge the LoRA projections into the SVD projections, but I'm not really sure how to do that on the ComfyUI side. I'll try to figure it out.
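Roughly, the merge could look something like this. This is a minimal numpy sketch under the assumption that the quantized layer computes W ≈ Wq + U @ V (quantized residual plus a low-rank SVD branch) and a LoRA contributes alpha * B @ A; it is not the actual hqqsvd/SDNQ code, and `merge_lora_into_svd` is a made-up helper name:

```python
import numpy as np

def merge_lora_into_svd(U, V, B, A, alpha):
    """Fold a LoRA update into an existing low-rank branch.

    Assumed layout (illustrative, not the library's real API):
      SVD branch:  U [out, r]    @ V [r, in]
      LoRA:        B [out, r_l]  @ A [r_l, in], scaled by alpha
    Concatenating the factors makes one branch that computes
    U @ V + alpha * B @ A in a single pair of matmuls.
    """
    U_merged = np.concatenate([U, alpha * B], axis=1)  # [out, r + r_l]
    V_merged = np.concatenate([V, A], axis=0)          # [r + r_l, in]
    return U_merged, V_merged
```

Because the quantized residual Wq is untouched, the LoRA can be removed again by slicing the extra columns/rows back off, which is presumably why changing strength is cheap while full unloading is harder.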
I have implemented a very crude LoRA loader node (SDNQLoraLoader) that should work for now.
Currently, LoRAs are not unloaded automatically; you have to manually unload the whole model if you want to remove a LoRA.
Changing a LoRA's strength should work without unloading, though.
Make sure to update the underlying library: pip install -U git+https://github.com/kanttouchthis/hqqsvd
Great, thank you! I will test it with pleasure! :)
Hey thank you for this. How is this compared to svdq nunchaku?
This basically uses the same technique, but without calibration data, so quality is slightly worse. It doesn't need a custom model implementation though, which makes it easier to adapt to new models. Performance is pretty similar thanks to int8 matmul (where supported) and compiled dequantization functions.
I just added support for loading LoRAs via ComfyUI's built-in LoraLoader nodes; LoRAs are now unloaded automatically, just like with normal models. SDNQLoraLoader was removed because it was kind of broken.