Refiner GGUFs (both the old and the new v2) don't work with torch.compile or sage-attention

#6
by patientxtr - opened

The base model has worked since day one, like all the other models do, but something is different with these new refiner GGUFs. With torch.compile and patch sage-attention enabled, I get this error: "BaseLoaderKJ._patch_modules..attention_sage() got an unexpected keyword argument 'transformer_options'". Without them the refiner works, but takes almost 4x as long.
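For context, a `TypeError` like this is the generic Python failure mode when a caller passes a keyword argument that a monkey-patched replacement function doesn't declare. The sketch below is hypothetical (the function names only mimic the error message, not the actual node's code) and shows why accepting `**kwargs` in the patched function avoids the crash when the caller's API gains new arguments such as `transformer_options`:

```python
# Hypothetical illustration of the TypeError, NOT the actual node code.
# The caller (stand-in for ComfyUI's sampler) forwards transformer_options,
# but a patch with a narrower signature rejects it.

def attention_sage_old(q, k, v, heads):
    # Patched function written before the caller started passing
    # transformer_options -- any extra kwarg raises TypeError.
    return "attention result"

def attention_sage_fixed(q, k, v, heads, transformer_options=None, **kwargs):
    # Accepting (and ignoring) unknown kwargs keeps the patch compatible
    # as the caller's interface evolves.
    return "attention result"

try:
    attention_sage_old(1, 2, 3, heads=8, transformer_options={})
except TypeError as e:
    print(e)  # ... got an unexpected keyword argument 'transformer_options'

print(attention_sage_fixed(1, 2, 3, heads=8, transformer_options={}))
```

This matches the outcome below: the fix belongs in the patching node's signature, not in the GGUF files.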

I guess it's neither the node's nor the GGUF files' problem; you might need to post it to the ComfyUI repo on GitHub.

It was the node; I'll post the fix on their GitHub. Thanks.

Disabling sage-attention and/or other unnecessary libraries may help.

patientxtr changed discussion status to closed
