Exception during processing !!! Activation scale shape mismatch: scale.shape=torch.Size([96, 24]), expected (12288, 24)
#1 by MonsterMMORPG - opened
Installed this: https://github.com/silveroxides/ComfyUI-QuantOps
Testing 2511.
2025-12-23 23:34:32.587 [Debug] [ComfyUI-0/STDERR] !!! Exception during processing !!! Activation scale shape mismatch: scale.shape=torch.Size([96, 24]), expected (12288, 24)
2025-12-23 23:34:32.593 [Warning] [ComfyUI-0/STDERR] Traceback (most recent call last):
2025-12-23 23:34:32.594 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\execution.py", line 516, in execute
2025-12-23 23:34:32.594 [Warning] [ComfyUI-0/STDERR] output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, v3_data=v3_data)
2025-12-23 23:34:32.594 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\execution.py", line 330, in get_output_data
2025-12-23 23:34:32.595 [Warning] [ComfyUI-0/STDERR] return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, v3_data=v3_data)
2025-12-23 23:34:32.595 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\execution.py", line 304, in _async_map_node_over_list
2025-12-23 23:34:32.595 [Warning] [ComfyUI-0/STDERR] await process_inputs(input_dict, i)
2025-12-23 23:34:32.596 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\execution.py", line 292, in process_inputs
2025-12-23 23:34:32.596 [Warning] [ComfyUI-0/STDERR] result = f(**inputs)
2025-12-23 23:34:32.596 [Warning] [ComfyUI-0/STDERR] File "F:\SwarmUI_Model_Downloader_v81\SwarmUI\src\BuiltinExtensions\ComfyUIBackend\ExtraNodes\SwarmComfyCommon\SwarmKSampler.py", line 357, in run_sampling
2025-12-23 23:34:32.596 [Warning] [ComfyUI-0/STDERR] return self.sample(model, noise_seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, start_at_step, end_at_step, var_seed, var_seed_strength, sigma_max, sigma_min, rho, add_noise, return_with_leftover_noise, previews)
2025-12-23 23:34:32.597 [Warning] [ComfyUI-0/STDERR] File "F:\SwarmUI_Model_Downloader_v81\SwarmUI\src\BuiltinExtensions\ComfyUIBackend\ExtraNodes\SwarmComfyCommon\SwarmKSampler.py", line 331, in sample
2025-12-23 23:34:32.597 [Warning] [ComfyUI-0/STDERR] samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_samples,
2025-12-23 23:34:32.597 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\comfy\sample.py", line 60, in sample
2025-12-23 23:34:32.598 [Warning] [ComfyUI-0/STDERR] samples = sampler.sample(noise, positive, negative, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
2025-12-23 23:34:32.598 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\comfy\samplers.py", line 1178, in sample
2025-12-23 23:34:32.598 [Warning] [ComfyUI-0/STDERR] return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
2025-12-23 23:34:32.599 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\comfy\samplers.py", line 1068, in sample
2025-12-23 23:34:32.599 [Warning] [ComfyUI-0/STDERR] return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
2025-12-23 23:34:32.599 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\comfy\samplers.py", line 1050, in sample
2025-12-23 23:34:32.600 [Warning] [ComfyUI-0/STDERR] output = executor.execute(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed, latent_shapes=latent_shapes)
2025-12-23 23:34:32.600 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\comfy\patcher_extension.py", line 112, in execute
2025-12-23 23:34:32.600 [Warning] [ComfyUI-0/STDERR] return self.original(*args, **kwargs)
2025-12-23 23:34:32.601 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\comfy\samplers.py", line 984, in outer_sample
2025-12-23 23:34:32.601 [Warning] [ComfyUI-0/STDERR] self.inner_model, self.conds, self.loaded_models = comfy.sampler_helpers.prepare_sampling(self.model_patcher, noise.shape, self.conds, self.model_options)
2025-12-23 23:34:32.602 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\comfy\sampler_helpers.py", line 130, in prepare_sampling
2025-12-23 23:34:32.602 [Warning] [ComfyUI-0/STDERR] return executor.execute(model, noise_shape, conds, model_options=model_options, force_full_load=force_full_load)
2025-12-23 23:34:32.602 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\comfy\patcher_extension.py", line 112, in execute
2025-12-23 23:34:32.603 [Warning] [ComfyUI-0/STDERR] return self.original(*args, **kwargs)
2025-12-23 23:34:32.603 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\comfy\sampler_helpers.py", line 138, in _prepare_sampling
2025-12-23 23:34:32.603 [Warning] [ComfyUI-0/STDERR] comfy.model_management.load_models_gpu([model] + models, memory_required=memory_required + inference_memory, minimum_memory_required=minimum_memory_required + inference_memory, force_full_load=force_full_load)
2025-12-23 23:34:32.604 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\comfy\model_management.py", line 704, in load_models_gpu
2025-12-23 23:34:32.604 [Warning] [ComfyUI-0/STDERR] loaded_model.model_load(lowvram_model_memory, force_patch_weights=force_patch_weights)
2025-12-23 23:34:32.604 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\comfy\model_management.py", line 509, in model_load
2025-12-23 23:34:32.604 [Warning] [ComfyUI-0/STDERR] self.model_use_more_vram(use_more_vram, force_patch_weights=force_patch_weights)
2025-12-23 23:34:32.605 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\comfy\model_management.py", line 539, in model_use_more_vram
2025-12-23 23:34:32.605 [Warning] [ComfyUI-0/STDERR] return self.model.partially_load(self.device, extra_memory, force_patch_weights=force_patch_weights)
2025-12-23 23:34:32.605 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\comfy\model_patcher.py", line 981, in partially_load
2025-12-23 23:34:32.606 [Warning] [ComfyUI-0/STDERR] raise e
2025-12-23 23:34:32.606 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\comfy\model_patcher.py", line 978, in partially_load
2025-12-23 23:34:32.606 [Warning] [ComfyUI-0/STDERR] self.load(device_to, lowvram_model_memory=current_used + extra_memory, force_patch_weights=force_patch_weights, full_load=full_load)
2025-12-23 23:34:32.607 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\comfy\model_patcher.py", line 777, in load
2025-12-23 23:34:32.608 [Warning] [ComfyUI-0/STDERR] self.patch_weight_to_device(key, device_to=device_to)
2025-12-23 23:34:32.608 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\comfy\model_patcher.py", line 630, in patch_weight_to_device
2025-12-23 23:34:32.608 [Warning] [ComfyUI-0/STDERR] temp_weight = convert_func(temp_weight, inplace=True)
2025-12-23 23:34:32.609 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\comfy\ops.py", line 619, in convert_weight
2025-12-23 23:34:32.609 [Warning] [ComfyUI-0/STDERR] return weight.dequantize()
2025-12-23 23:34:32.610 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\comfy\quant_ops.py", line 197, in dequantize
2025-12-23 23:34:32.610 [Warning] [ComfyUI-0/STDERR] return LAYOUTS[self._layout_type].dequantize(self._qdata, **self._layout_params)
2025-12-23 23:34:32.610 [Warning] [ComfyUI-0/STDERR] File "F:\Comfy_UI_V52\ComfyUI\custom_nodes\ComfyUI-QuantOps\quant_layouts\int8_layout.py", line 277, in dequantize
2025-12-23 23:34:32.611 [Warning] [ComfyUI-0/STDERR] raise RuntimeError(
2025-12-23 23:34:32.611 [Warning] [ComfyUI-0/STDERR] RuntimeError: Activation scale shape mismatch: scale.shape=torch.Size([96, 24]), expected (12288, 24)
2025-12-23 23:34:32.611 [Warning] [ComfyUI-0/STDERR]
2025-12-23 23:34:32.612 [Debug] [ComfyUI-0/STDERR] Prompt executed in 3.20 seconds
2025-12-23 23:34:32.693 [Debug] Failed to process comfy workflow for inputs T2IParamInput(prompt: turn into pixel art, model: qwen_image_edit_2511_int8mixed, seed: 1652804466, steps: 12, cfgscale: 1, aspectratio: 16:9, width: 1744, height: 992, sampler: euler_ancestral, scheduler: simple, automaticvae: True, loras: Qwen-Image-Edit-2511-Lightning-4steps-V1.0-fp32, loraweights: 1, preferreddtype: default, promptimages: iVBORw0KGgoAAAANSUhEUgAADSAAAAeACAIAAADrN4/EAAAQAElEQVR4AVT9BcAexbU/AJ8zsruPvBLH3d3dXYtbaaG99fbeym2plwLFSSCGExIS3J0gxVqsuAWS4Bp79bHdHft+84T//b5v3nlmZ2fOHD9nZndTKh770D+4MDy0MDz8fnhkYazoPLww3j74XnhwQXjovfDQ++GhxeHhReHhhR710Q/Dyjr/g/DoB2H+h+GRDwIw3Puuu+tNe+ur..., negativeprompt: ) with raw workflow { "4": { "class_type": "UNETLoader", "inputs": { "unet_name": "qwen_image_edit_2511_int8mixed.safetensors", "weight_dtype": "default" } }, "100": { "class_type": "CLIPLoader", "inputs": { "clip_name": "qwen_2.5_vl_7b_fp8_scaled.safetensors", "type": "qwen_image", "device": "default" } }, "101": { "class_type": "VAELoader", "inputs": { "vae_name": "QwenImage\\qwen_image_vae.safetensors" } }, "102": { "class_type": "ModelSamplingAuraFlow", "inputs": { "model": [ "4", 0 ], "shift": 3 } }, "3000": { "class_type": "LoraLoaderModelOnly", "inputs": { "model": [ "102", 0 ], "lora_name": "Qwen-Image-Edit-2511-Lightning-4steps-V1.0-fp32.safetensors", "strength_model": 1 } }, "5": { "class_type": "EmptySD3LatentImage", "inputs": { "batch_size": 1, "height": 992, "width": 1744 } }, "103": { "class_type": "SwarmLoadImageB64", "inputs": { "image_base64": "iVBORw0KGgoAAAANSUhEUgAADSAAAAeACAIAAADrN4/EAAAQAElEQVR4AVT9BcAexbU/AJ8zsruPvBLH3d3dXYtbaaG99fbeym2plwLFSSCGExIS3J0gxVqsuAWS4Bp79bHdHft+84T//b5v3nlmZ2fOHD9nZndTKh770D+4MDy0MDz8fnhkYazoPLww3j74XnhwQXjovfDQ++GhxeHhReHhhR710Q/Dyjr/g/DoB2H+h+GRDwIw3Puuu+tNe..." 
} }, "104": { "class_type": "ImageScale", "inputs": { "image": [ "103", 0 ], "width": 512, "height": 320, "crop": "disabled", "upscale_method": "lanczos" } }, "6": { "class_type": "TextEncodeQwenImageEditPlus", "inputs": { "clip": [ "100", 0 ], "prompt": "turn into pixel art", "vae": null, "image1": [ "104", 0 ], "image2": null, "image3": null } }, "7": { "class_type": "TextEncodeQwenImageEditPlus", "inputs": { "clip": [ "100", 0 ], "prompt": "", "vae": null, "image1": [ "104", 0 ], "image2": null, "image3": null } }, "105": { "class_type": "ImageScale", "inputs": { "image": [ "103", 0 ], "width": 1344, "height": 768, "crop": "disabled", "upscale_method": "lanczos" } }, "106": { "class_type": "VAEEncode", "inputs": { "vae": [ "101", 0 ], "pixels": [ "105", 0 ] } }, "107": { "class_type": "ReferenceLatent", "inputs": { "conditioning": [ "6", 0 ], "latent": [ "106", 0 ] } }, "108": { "class_type": "ReferenceLatent", "inputs": { "conditioning": [ "7", 0 ], "latent": [ "106", 0 ] } }, "10": { "class_type": "SwarmKSampler", "inputs": { "model": [ "3000", 0 ], "noise_seed": 1652804466, "steps": 12, "cfg": 1, "sampler_name": "euler_ancestral", "scheduler": "simple", "positive": [ "107", 0 ], "negative": [ "108", 0 ], "latent_image": [ "5", 0 ], "start_at_step": 0, "end_at_step": 10000, "return_with_leftover_noise": "disable", "add_noise": "enable", "control_after_generate": "fixed", "var_seed": 0, "var_seed_strength": 0, "sigma_min": -1, "sigma_max": -1, "rho": 7, "previews": "default", "tile_sample": False, "tile_size": 1328 } }, "8": { "class_type": "VAEDecode", "inputs": { "vae": [ "101", 0 ], "samples": [ "10", 0 ] } }, "9": { "class_type": "SwarmSaveImageWS", "inputs": { "images": [ "8", 0 ], "bit_depth": "8bit" } } }
2025-12-23 23:34:32.694 [Debug] Refused to generate image for local: ComfyUI execution error: Activation scale shape mismatch: scale.shape=torch.Size([96, 24]), expected (12288, 24)
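For context, the failure is a defensive shape check inside the int8 layout's `dequantize`: the checkpoint's stored scale tensor has 96 rows, while the layout expects one scale row per output feature (12288). Notably, 96 is exactly 12288 / 128, which suggests the checkpoint carries block-wise scales while the loader expects per-row scales. Below is a minimal sketch of that kind of check and group-wise dequantization; all names and shapes here are hypothetical illustrations, not the actual ComfyUI-QuantOps code:

```python
import torch

def dequantize_int8(qdata: torch.Tensor, scale: torch.Tensor,
                    expected_rows: int) -> torch.Tensor:
    """Hypothetical group-wise int8 dequantization.

    qdata: (rows, cols) int8 weight.
    scale: (rows, groups) float scales, one per contiguous
           block of cols // groups columns.
    """
    if scale.shape[0] != expected_rows:
        # This is the class of check that produced the error in the log:
        # scale rows do not match the expected output-feature count.
        raise RuntimeError(
            f"Activation scale shape mismatch: scale.shape={tuple(scale.shape)}, "
            f"expected ({expected_rows}, {scale.shape[1]})")
    rows, cols = qdata.shape
    group_size = cols // scale.shape[1]
    # Broadcast each group's scale across its columns, then dequantize.
    s = scale.repeat_interleave(group_size, dim=1)
    return qdata.float() * s
```

A checkpoint whose scales were saved for a different block layout (96 rows where the loader expects 12288) would trip this check exactly as shown in the traceback, which points to a model file quantized in a format the installed node version does not yet handle.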
Those are experimental models. I have not had time to fully test every scenario because of limited hardware access at the moment. Also, you do not specify which 2511 model you were testing, nor did you provide any indication of your node settings; the logs alone do not show either of those things. So, to be honest, I can't do anything with this report as it stands.