
SamplerCustomAdvanced Error - forward_orig() takes from 7 to 9 positional arguments but 10 were given #164

patrickmalicek opened this issue Nov 15, 2024 · 1 comment

@patrickmalicek

ComfyUI Error Report

Error Details

  • Node Type: SamplerCustomAdvanced
  • Exception Type: TypeError
  • Exception Message: forward_orig() takes from 7 to 9 positional arguments but 10 were given

Stack Trace

  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)

  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy_extras\nodes_custom_sampler.py", line 633, in sample
    samples = guider.sample(noise.generate_noise(latent), latent_image, sampler, sigmas, denoise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=noise.seed)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 740, in sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 719, in inner_sample
    samples = sampler.sample(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 624, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\k_diffusion\sampling.py", line 155, in sample_euler
    denoised = model(x, sigma_hat * s_in, **extra_args)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 299, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 706, in __call__
    return self.predict_noise(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 709, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 279, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 228, in calc_cond_batch
    output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\model_base.py", line 144, in apply_model
    model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\ldm\flux\model.py", line 181, in forward
    out = self.forward_orig(img, img_ids, context, txt_ids, timestep, y, guidance, control, transformer_options)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
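The trace shows the mismatch directly: the call in comfy/ldm/flux/model.py line 181 passes nine arguments plus self (ten positionals) to forward_orig, while the forward_orig that actually gets invoked (presumably the override installed by the PuLID-Flux node, per the fix suggested in the comment below) still has an older signature accepting at most nine. A minimal standalone sketch, not ComfyUI code and with parameter names that only mirror the call site in the trace, reproduces the same TypeError:

```python
# Illustration only: an old-style forward_orig with 7 required positionals
# (including self) and 2 optional ones, i.e. 7 to 9 positional arguments total.
def forward_orig(self, img, img_ids, txt, txt_ids, timesteps, y,
                 guidance=None, control=None):
    return "ok"

# Newer ComfyUI builds add transformer_options as a tenth positional argument,
# exactly as in the call shown above, so the old signature no longer fits:
forward_orig("self", "img", "img_ids", "context", "txt_ids", "timestep", "y",
             "guidance", "control", "transformer_options")
# TypeError: forward_orig() takes from 7 to 9 positional arguments but 10 were given
```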

System Information

  • ComfyUI Version: v0.2.7-22-g122c9ca
  • Arguments: ComfyUI\main.py --windows-standalone-build
  • OS: nt
  • Python Version: 3.12.7 (tags/v3.12.7:0b05ead, Oct 1 2024, 03:06:41) [MSC v.1941 64 bit (AMD64)]
  • Embedded Python: true
  • PyTorch Version: 2.5.1+cu124

Devices

  • Name: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
    • Type: cuda
    • VRAM Total: 25756696576
    • VRAM Free: 437564032
    • Torch VRAM Total: 23320330240
    • Torch VRAM Free: 80409216

Logs

2024-11-15T00:34:19.737410 - [START] Security scan2024-11-15T00:34:19.737410 - 
2024-11-15T00:34:20.415679 - [DONE] Security scan2024-11-15T00:34:20.415679 - 
2024-11-15T00:34:20.491836 - ## ComfyUI-Manager: installing dependencies done.2024-11-15T00:34:20.491836 - 
2024-11-15T00:34:20.491836 - ** ComfyUI startup time:2024-11-15T00:34:20.491836 -  2024-11-15T00:34:20.491836 - 2024-11-15 00:34:20.4918362024-11-15T00:34:20.491836 - 
2024-11-15T00:34:20.514654 - ** Platform:2024-11-15T00:34:20.514654 -  2024-11-15T00:34:20.514654 - Windows2024-11-15T00:34:20.514654 - 
2024-11-15T00:34:20.514654 - ** Python version:2024-11-15T00:34:20.514654 -  2024-11-15T00:34:20.514654 - 3.12.7 (tags/v3.12.7:0b05ead, Oct  1 2024, 03:06:41) [MSC v.1941 64 bit (AMD64)]2024-11-15T00:34:20.514654 - 
2024-11-15T00:34:20.514654 - ** Python executable:2024-11-15T00:34:20.514654 -  2024-11-15T00:34:20.514654 - C:\ComfyUI\ComfyUI_windows_portable\python_embeded\python.exe2024-11-15T00:34:20.514654 - 
2024-11-15T00:34:20.514654 - ** ComfyUI Path:2024-11-15T00:34:20.514654 -  2024-11-15T00:34:20.514654 - C:\ComfyUI\ComfyUI_windows_portable\ComfyUI2024-11-15T00:34:20.514654 - 
2024-11-15T00:34:20.514654 - ** Log path:2024-11-15T00:34:20.514654 -  2024-11-15T00:34:20.514654 - C:\ComfyUI\ComfyUI_windows_portable\comfyui.log2024-11-15T00:34:20.515648 - 
2024-11-15T00:34:20.520643 - 
Prestartup times for custom nodes:2024-11-15T00:34:20.520643 - 
2024-11-15T00:34:20.520643 -    0.0 seconds:2024-11-15T00:34:20.520643 -  2024-11-15T00:34:20.520643 - C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\rgthree-comfy2024-11-15T00:34:20.520643 - 
2024-11-15T00:34:20.520643 -    0.8 seconds:2024-11-15T00:34:20.520643 -  2024-11-15T00:34:20.520643 - C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Manager2024-11-15T00:34:20.520643 - 
2024-11-15T00:34:20.520643 - 
2024-11-15T00:34:21.906042 - Total VRAM 24564 MB, total RAM 65143 MB
2024-11-15T00:34:21.906042 - pytorch version: 2.5.1+cu124
2024-11-15T00:34:21.907042 - Set vram state to: NORMAL_VRAM
2024-11-15T00:34:21.907042 - Device: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
2024-11-15T00:34:22.767924 - Using pytorch cross attention
2024-11-15T00:34:23.889947 - [Prompt Server] web root: C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\web
2024-11-15T00:34:24.595925 - ### Loading: ComfyUI-Impact-Pack (V7.11.3)2024-11-15T00:34:24.595925 - 
2024-11-15T00:34:24.765310 - ### Loading: ComfyUI-Impact-Pack (Subpack: V0.8)2024-11-15T00:34:24.765310 - 
2024-11-15T00:34:24.790351 - [Impact Pack] Wildcards loading done.2024-11-15T00:34:24.790351 - 
2024-11-15T00:34:24.799348 - Total VRAM 24564 MB, total RAM 65143 MB
2024-11-15T00:34:24.799348 - pytorch version: 2.5.1+cu124
2024-11-15T00:34:24.800348 - Set vram state to: NORMAL_VRAM
2024-11-15T00:34:24.800348 - Device: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
2024-11-15T00:34:24.823028 - ### Loading: ComfyUI-Manager (V2.51.9)2024-11-15T00:34:24.823028 - 
2024-11-15T00:34:24.892074 - ### ComfyUI Revision: 2830 [122c9ca1] | Released on '2024-11-14'2024-11-15T00:34:24.892074 - 
2024-11-15T00:34:25.283375 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json2024-11-15T00:34:25.288375 - 
2024-11-15T00:34:25.354775 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json2024-11-15T00:34:25.354775 - 
2024-11-15T00:34:25.355776 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json2024-11-15T00:34:25.355776 - 
2024-11-15T00:34:25.368775 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json2024-11-15T00:34:25.368775 - 
2024-11-15T00:34:25.382775 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json2024-11-15T00:34:25.382775 - 
2024-11-15T00:34:25.693359 - C:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\timm\models\layers\__init__.py:48: FutureWarning: Importing from timm.models.layers is deprecated, please import via timm.layers
  warnings.warn(f"Importing from {__name__} is deprecated, please import via timm.layers", FutureWarning)
2024-11-15T00:34:25.696368 - Please 'pip install xformers'2024-11-15T00:34:25.696368 - 
2024-11-15T00:34:25.696368 - Nvidia APEX normalization not installed, using PyTorch LayerNorm2024-11-15T00:34:25.697369 - 
2024-11-15T00:34:25.775841 - Please 'pip install xformers'2024-11-15T00:34:25.775841 - 
2024-11-15T00:34:25.776843 - Nvidia APEX normalization not installed, using PyTorch LayerNorm2024-11-15T00:34:25.776843 - 
2024-11-15T00:34:25.829961 - [ReActor] - STATUS - Running v0.5.1-b2 in ComfyUI2024-11-15T00:34:25.829961 - 
2024-11-15T00:34:25.934785 - Torch version: 2.5.1+cu1242024-11-15T00:34:25.934785 - 
2024-11-15T00:34:25.966286 - ------------------------------------------2024-11-15T00:34:25.966286 - 
2024-11-15T00:34:25.966286 - Comfyroll Studio v1.76 : 175 Nodes Loaded2024-11-15T00:34:25.966286 - 
2024-11-15T00:34:25.966286 - ------------------------------------------2024-11-15T00:34:25.966286 - 
2024-11-15T00:34:25.966286 - ** For changes, please see patch notes at https://github.com/Suzie1/ComfyUI_Comfyroll_CustomNodes/blob/main/Patch_Notes.md2024-11-15T00:34:25.966286 - 
2024-11-15T00:34:25.966286 - ** For help, please see the wiki at https://github.com/Suzie1/ComfyUI_Comfyroll_CustomNodes/wiki2024-11-15T00:34:25.966286 - 
2024-11-15T00:34:25.966286 - ------------------------------------------2024-11-15T00:34:25.966286 - 
2024-11-15T00:34:25.970286 - [comfyui_controlnet_aux] | INFO -> Using ckpts path: C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui_controlnet_aux\ckpts
2024-11-15T00:34:25.970286 - [comfyui_controlnet_aux] | INFO -> Using symlinks: False
2024-11-15T00:34:25.970286 - [comfyui_controlnet_aux] | INFO -> Using ort providers: ['CUDAExecutionProvider', 'DirectMLExecutionProvider', 'OpenVINOExecutionProvider', 'ROCMExecutionProvider', 'CPUExecutionProvider', 'CoreMLExecutionProvider']
2024-11-15T00:34:26.015164 - 
2024-11-15T00:34:26.015164 - [rgthree-comfy] Loaded 42 fantastic nodes. 🎉2024-11-15T00:34:26.015164 - 
2024-11-15T00:34:26.015164 - 
2024-11-15T00:34:26.046934 - 
Import times for custom nodes:
2024-11-15T00:34:26.046934 -    0.0 seconds: C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\websocket_image_save.py
2024-11-15T00:34:26.046934 -    0.0 seconds: C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Custom-Scripts
2024-11-15T00:34:26.046934 -    0.0 seconds: C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-GGUF
2024-11-15T00:34:26.046934 -    0.0 seconds: C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\rgthree-comfy
2024-11-15T00:34:26.046934 -    0.0 seconds: C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_Comfyroll_CustomNodes
2024-11-15T00:34:26.046934 -    0.0 seconds: C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-KJNodes
2024-11-15T00:34:26.047935 -    0.0 seconds: C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\x-flux-comfyui
2024-11-15T00:34:26.047935 -    0.0 seconds: C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui_controlnet_aux
2024-11-15T00:34:26.047935 -    0.1 seconds: C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-PuLID-Flux-Enhanced
2024-11-15T00:34:26.047935 -    0.1 seconds: C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-reactor-node
2024-11-15T00:34:26.047935 -    0.2 seconds: C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Impact-Pack
2024-11-15T00:34:26.047935 -    0.3 seconds: C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Manager
2024-11-15T00:34:26.047935 -    0.3 seconds: C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Florence2
2024-11-15T00:34:26.047935 -    0.7 seconds: C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-PuLID-Flux
2024-11-15T00:34:26.047935 - 
2024-11-15T00:34:26.052935 - Starting server

2024-11-15T00:34:26.053936 - To see the GUI go to: http://127.0.0.1:8188
2024-11-15T00:34:26.577386 - FETCH DATA from: C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-Manager\extension-node-map.json2024-11-15T00:34:26.577386 - 2024-11-15T00:34:26.580464 -  [DONE]2024-11-15T00:34:26.580464 - 
2024-11-15T00:34:27.065824 - ['AndroFlux-v19.safetensors', 'C64_Flux.safetensors', 'aidmaFluxStyleXL-v0.2.safetensors', 'anime_lora.safetensors', 'anime_lora_comfy_converted.safetensors', 'art_lora.safetensors', 'art_lora_comfy_converted.safetensors', 'detailer_flux_v1.safetensors', 'disney_lora.safetensors', 'disney_lora_comfy_converted.safetensors', 'flux_realism_lora.safetensors', 'furry_lora.safetensors', 'mjv6_lora.safetensors', 'mjv6_lora_comfy_converted.safetensors', 'realism_lora.safetensors', 'realism_lora_comfy_converted.safetensors', 'scenery_lora.safetensors', 'scenery_lora_comfy_converted.safetensors']2024-11-15T00:34:27.065824 - 
2024-11-15T00:34:27.065824 - ['AndroFlux-v19.safetensors', 'C64_Flux.safetensors', 'aidmaFluxStyleXL-v0.2.safetensors', 'anime_lora.safetensors', 'anime_lora_comfy_converted.safetensors', 'art_lora.safetensors', 'art_lora_comfy_converted.safetensors', 'detailer_flux_v1.safetensors', 'disney_lora.safetensors', 'disney_lora_comfy_converted.safetensors', 'flux_realism_lora.safetensors', 'furry_lora.safetensors', 'mjv6_lora.safetensors', 'mjv6_lora_comfy_converted.safetensors', 'realism_lora.safetensors', 'realism_lora_comfy_converted.safetensors', 'scenery_lora.safetensors', 'scenery_lora_comfy_converted.safetensors']2024-11-15T00:34:27.065824 - 
2024-11-15T00:34:31.291902 - got prompt
2024-11-15T00:34:31.307902 - using sdpa for attention2024-11-15T00:34:31.308904 - 
2024-11-15T00:34:31.363772 - No flash_attn import to remove2024-11-15T00:34:31.363772 - 
2024-11-15T00:34:33.419633 - </s><s>The image is a portrait of a middle-aged man with a big smile on his face. He is standing in an office with his arms crossed in front of him. He has grey hair and a beard, and is wearing a navy blue suit with a white shirt and a white pocket square. He also has a watch on his left wrist. The background is blurred, but it appears to be an indoor space with plants and a wooden ceiling. The overall mood of the image is happy and confident.</s>2024-11-15T00:34:33.419633 - 
2024-11-15T00:34:33.419633 - Offloading model...2024-11-15T00:34:33.420629 - 
2024-11-15T00:34:33.592346 - Using pytorch attention in VAE
2024-11-15T00:34:33.593349 - Using pytorch attention in VAE
2024-11-15T00:34:34.067273 - Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}2024-11-15T00:34:34.067965 - 
2024-11-15T00:34:34.116432 - find model:2024-11-15T00:34:34.116432 -  2024-11-15T00:34:34.116432 - C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\models\insightface\models\antelopev2\1k3d68.onnx2024-11-15T00:34:34.116432 -  2024-11-15T00:34:34.116432 - landmark_3d_682024-11-15T00:34:34.116432 -  2024-11-15T00:34:34.116432 - ['None', 3, 192, 192]2024-11-15T00:34:34.116432 -  2024-11-15T00:34:34.117434 - 0.02024-11-15T00:34:34.117434 -  2024-11-15T00:34:34.117434 - 1.02024-11-15T00:34:34.117592 - 
2024-11-15T00:34:34.136112 - Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}2024-11-15T00:34:34.136112 - 
2024-11-15T00:34:34.139111 - find model:2024-11-15T00:34:34.139111 -  2024-11-15T00:34:34.139111 - C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\models\insightface\models\antelopev2\2d106det.onnx2024-11-15T00:34:34.139111 -  2024-11-15T00:34:34.139111 - landmark_2d_1062024-11-15T00:34:34.139111 -  2024-11-15T00:34:34.139111 - ['None', 3, 192, 192]2024-11-15T00:34:34.139111 -  2024-11-15T00:34:34.139111 - 0.02024-11-15T00:34:34.139111 -  2024-11-15T00:34:34.139111 - 1.02024-11-15T00:34:34.139111 - 
2024-11-15T00:34:34.149110 - Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}2024-11-15T00:34:34.149110 - 
2024-11-15T00:34:34.150111 - find model:2024-11-15T00:34:34.150111 -  2024-11-15T00:34:34.150111 - C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\models\insightface\models\antelopev2\genderage.onnx2024-11-15T00:34:34.150111 -  2024-11-15T00:34:34.150111 - genderage2024-11-15T00:34:34.150111 -  2024-11-15T00:34:34.150111 - ['None', 3, 96, 96]2024-11-15T00:34:34.151110 -  2024-11-15T00:34:34.151110 - 0.02024-11-15T00:34:34.151110 -  2024-11-15T00:34:34.151110 - 1.02024-11-15T00:34:34.151110 - 
2024-11-15T00:34:34.428209 - Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}2024-11-15T00:34:34.429236 - 
2024-11-15T00:34:34.509246 - find model:2024-11-15T00:34:34.509246 -  2024-11-15T00:34:34.509246 - C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\models\insightface\models\antelopev2\glintr100.onnx2024-11-15T00:34:34.509246 -  2024-11-15T00:34:34.509246 - recognition2024-11-15T00:34:34.509246 -  2024-11-15T00:34:34.509246 - ['None', 3, 112, 112]2024-11-15T00:34:34.509246 -  2024-11-15T00:34:34.509246 - 127.52024-11-15T00:34:34.509246 -  2024-11-15T00:34:34.509246 - 127.52024-11-15T00:34:34.509246 - 
2024-11-15T00:34:34.547828 - Applied providers: ['CPUExecutionProvider'], with options: {'CPUExecutionProvider': {}}2024-11-15T00:34:34.547828 - 
2024-11-15T00:34:34.547828 - find model:2024-11-15T00:34:34.548297 -  2024-11-15T00:34:34.548297 - C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\models\insightface\models\antelopev2\scrfd_10g_bnkps.onnx2024-11-15T00:34:34.548297 -  2024-11-15T00:34:34.548380 - detection2024-11-15T00:34:34.548380 -  2024-11-15T00:34:34.548380 - [1, 3, '?', '?']2024-11-15T00:34:34.548380 -  2024-11-15T00:34:34.548380 - 127.52024-11-15T00:34:34.548380 -  2024-11-15T00:34:34.548380 - 128.02024-11-15T00:34:34.548380 - 
2024-11-15T00:34:34.548380 - set det-size:2024-11-15T00:34:34.548645 -  2024-11-15T00:34:34.548645 - (640, 640)2024-11-15T00:34:34.548645 - 
2024-11-15T00:34:34.548717 - Loaded EVA02-CLIP-L-14-336 model config.
2024-11-15T00:34:34.564727 - Shape of rope freq: torch.Size([576, 64])
2024-11-15T00:34:38.020809 - Loading pretrained EVA02-CLIP-L-14-336 weights (eva_clip).
2024-11-15T00:34:38.466306 - incompatible_keys.missing_keys: ['visual.rope.freqs_cos', 'visual.rope.freqs_sin', 'visual.blocks.0.attn.rope.freqs_cos', 'visual.blocks.0.attn.rope.freqs_sin', 'visual.blocks.1.attn.rope.freqs_cos', 'visual.blocks.1.attn.rope.freqs_sin', 'visual.blocks.2.attn.rope.freqs_cos', 'visual.blocks.2.attn.rope.freqs_sin', 'visual.blocks.3.attn.rope.freqs_cos', 'visual.blocks.3.attn.rope.freqs_sin', 'visual.blocks.4.attn.rope.freqs_cos', 'visual.blocks.4.attn.rope.freqs_sin', 'visual.blocks.5.attn.rope.freqs_cos', 'visual.blocks.5.attn.rope.freqs_sin', 'visual.blocks.6.attn.rope.freqs_cos', 'visual.blocks.6.attn.rope.freqs_sin', 'visual.blocks.7.attn.rope.freqs_cos', 'visual.blocks.7.attn.rope.freqs_sin', 'visual.blocks.8.attn.rope.freqs_cos', 'visual.blocks.8.attn.rope.freqs_sin', 'visual.blocks.9.attn.rope.freqs_cos', 'visual.blocks.9.attn.rope.freqs_sin', 'visual.blocks.10.attn.rope.freqs_cos', 'visual.blocks.10.attn.rope.freqs_sin', 'visual.blocks.11.attn.rope.freqs_cos', 'visual.blocks.11.attn.rope.freqs_sin', 'visual.blocks.12.attn.rope.freqs_cos', 'visual.blocks.12.attn.rope.freqs_sin', 'visual.blocks.13.attn.rope.freqs_cos', 'visual.blocks.13.attn.rope.freqs_sin', 'visual.blocks.14.attn.rope.freqs_cos', 'visual.blocks.14.attn.rope.freqs_sin', 'visual.blocks.15.attn.rope.freqs_cos', 'visual.blocks.15.attn.rope.freqs_sin', 'visual.blocks.16.attn.rope.freqs_cos', 'visual.blocks.16.attn.rope.freqs_sin', 'visual.blocks.17.attn.rope.freqs_cos', 'visual.blocks.17.attn.rope.freqs_sin', 'visual.blocks.18.attn.rope.freqs_cos', 'visual.blocks.18.attn.rope.freqs_sin', 'visual.blocks.19.attn.rope.freqs_cos', 'visual.blocks.19.attn.rope.freqs_sin', 'visual.blocks.20.attn.rope.freqs_cos', 'visual.blocks.20.attn.rope.freqs_sin', 'visual.blocks.21.attn.rope.freqs_cos', 'visual.blocks.21.attn.rope.freqs_sin', 'visual.blocks.22.attn.rope.freqs_cos', 'visual.blocks.22.attn.rope.freqs_sin', 'visual.blocks.23.attn.rope.freqs_cos', 'visual.blocks.23.attn.rope.freqs_sin']
2024-11-15T00:34:40.153833 - Loading PuLID-Flux model.
2024-11-15T00:34:40.725486 - model weight dtype torch.bfloat16, manual cast: None
2024-11-15T00:34:40.725486 - model_type FLUX
2024-11-15T00:34:56.300089 - Requested to load FluxClipModel_
2024-11-15T00:34:56.300089 - Loading 1 new model
2024-11-15T00:34:56.310087 - loaded completely 0.0 9319.23095703125 True
2024-11-15T00:35:02.667382 - clip missing: ['text_projection.weight']
2024-11-15T00:35:03.690378 - Requested to load Flux
2024-11-15T00:35:03.690378 - Loading 1 new model
2024-11-15T00:35:10.861110 - loaded partially 19672.41067285156 20461.23046875 0
2024-11-15T00:35:10.870888 - 
  0%|                                                                                           | 0/15 [00:00<?, ?it/s]2024-11-15T00:35:10.872888 - 
  0%|                                                                                           | 0/15 [00:00<?, ?it/s]2024-11-15T00:35:10.872888 - 
2024-11-15T00:35:10.874888 - !!! Exception during processing !!! forward_orig() takes from 7 to 9 positional arguments but 10 were given
2024-11-15T00:35:10.877888 - Traceback (most recent call last):
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy_extras\nodes_custom_sampler.py", line 633, in sample
    samples = guider.sample(noise.generate_noise(latent), latent_image, sampler, sigmas, denoise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=noise.seed)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 740, in sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 719, in inner_sample
    samples = sampler.sample(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 624, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\k_diffusion\sampling.py", line 155, in sample_euler
    denoised = model(x, sigma_hat * s_in, **extra_args)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 299, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 706, in __call__
    return self.predict_noise(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 709, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 279, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 228, in calc_cond_batch
    output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\model_base.py", line 144, in apply_model
    model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\ldm\flux\model.py", line 181, in forward
    out = self.forward_orig(img, img_ids, context, txt_ids, timestep, y, guidance, control, transformer_options)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: forward_orig() takes from 7 to 9 positional arguments but 10 were given

2024-11-15T00:35:10.879888 - Prompt executed in 39.58 seconds
2024-11-15T00:49:47.969528 - ['AndroFlux-v19.safetensors', 'C64_Flux.safetensors', 'aidmaFluxStyleXL-v0.2.safetensors', 'anime_lora.safetensors', 'anime_lora_comfy_converted.safetensors', 'art_lora.safetensors', 'art_lora_comfy_converted.safetensors', 'detailer_flux_v1.safetensors', 'disney_lora.safetensors', 'disney_lora_comfy_converted.safetensors', 'flux_realism_lora.safetensors', 'furry_lora.safetensors', 'mjv6_lora.safetensors', 'mjv6_lora_comfy_converted.safetensors', 'realism_lora.safetensors', 'realism_lora_comfy_converted.safetensors', 'scenery_lora.safetensors', 'scenery_lora_comfy_converted.safetensors']2024-11-15T00:49:47.969528 - 
2024-11-15T00:49:47.969528 - ['AndroFlux-v19.safetensors', 'C64_Flux.safetensors', 'aidmaFluxStyleXL-v0.2.safetensors', 'anime_lora.safetensors', 'anime_lora_comfy_converted.safetensors', 'art_lora.safetensors', 'art_lora_comfy_converted.safetensors', 'detailer_flux_v1.safetensors', 'disney_lora.safetensors', 'disney_lora_comfy_converted.safetensors', 'flux_realism_lora.safetensors', 'furry_lora.safetensors', 'mjv6_lora.safetensors', 'mjv6_lora_comfy_converted.safetensors', 'realism_lora.safetensors', 'realism_lora_comfy_converted.safetensors', 'scenery_lora.safetensors', 'scenery_lora_comfy_converted.safetensors']2024-11-15T00:49:47.969528 - 
2024-11-15T00:49:52.337081 - got prompt
2024-11-15T00:49:54.185866 - Loading PuLID-Flux model.
2024-11-15T00:49:56.043822 - Unloading models for lowram load.
2024-11-15T00:49:56.411846 - 0 models unloaded.
2024-11-15T00:49:56.413850 - 
  0%|                                                                                           | 0/15 [00:00<?, ?it/s]2024-11-15T00:49:56.414851 - 
  0%|                                                                                           | 0/15 [00:00<?, ?it/s]2024-11-15T00:49:56.414851 - 
2024-11-15T00:49:56.415850 - !!! Exception during processing !!! forward_orig() takes from 7 to 9 positional arguments but 10 were given
2024-11-15T00:49:56.416850 - Traceback (most recent call last):
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy_extras\nodes_custom_sampler.py", line 633, in sample
    samples = guider.sample(noise.generate_noise(latent), latent_image, sampler, sigmas, denoise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=noise.seed)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 740, in sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 719, in inner_sample
    samples = sampler.sample(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 624, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\k_diffusion\sampling.py", line 155, in sample_euler
    denoised = model(x, sigma_hat * s_in, **extra_args)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 299, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 706, in __call__
    return self.predict_noise(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 709, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 279, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 228, in calc_cond_batch
    output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\model_base.py", line 144, in apply_model
    model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\ldm\flux\model.py", line 181, in forward
    out = self.forward_orig(img, img_ids, context, txt_ids, timestep, y, guidance, control, transformer_options)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: forward_orig() takes from 7 to 9 positional arguments but 10 were given

2024-11-15T00:49:56.418850 - Prompt executed in 4.08 seconds
2024-11-15T00:51:47.201712 - got prompt
2024-11-15T00:51:48.466613 - </s><s>The image is a portrait of a middle-aged man with a big smile on his face. He is standing in an office with his arms crossed in front of him. He has grey hair and a beard, and is wearing a navy blue suit with a white shirt and a white pocket square. He also has a watch on his left wrist. The background is blurred, but it appears to be an indoor space with plants and a wooden ceiling. The man looks confident and happy.</s>2024-11-15T00:51:48.466613 - 
2024-11-15T00:51:48.466613 - Offloading model...2024-11-15T00:51:48.466613 - 
2024-11-15T00:51:48.550399 - Requested to load FluxClipModel_
2024-11-15T00:51:48.550399 - Loading 1 new model
2024-11-15T00:51:52.917071 - loaded completely 0.0 9319.23095703125 True
2024-11-15T00:51:56.772847 - loaded partially 19672.41067285156 20461.23046875 0
2024-11-15T00:51:56.773848 - 
  0%|                                                                                           | 0/15 [00:00<?, ?it/s]2024-11-15T00:51:56.774846 - 
  0%|                                                                                           | 0/15 [00:00<?, ?it/s]2024-11-15T00:51:56.774846 - 
2024-11-15T00:51:56.775848 - !!! Exception during processing !!! forward_orig() takes from 7 to 9 positional arguments but 10 were given
2024-11-15T00:51:56.776848 - Traceback (most recent call last):
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy_extras\nodes_custom_sampler.py", line 633, in sample
    samples = guider.sample(noise.generate_noise(latent), latent_image, sampler, sigmas, denoise_mask=noise_mask, callback=callback, disable_pbar=disable_pbar, seed=noise.seed)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 740, in sample
    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 719, in inner_sample
    samples = sampler.sample(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 624, in sample
    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\k_diffusion\sampling.py", line 155, in sample_euler
    denoised = model(x, sigma_hat * s_in, **extra_args)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 299, in __call__
    out = self.inner_model(x, sigma, model_options=model_options, seed=seed)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 706, in __call__
    return self.predict_noise(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 709, in predict_noise
    return sampling_function(self.inner_model, x, timestep, self.conds.get("negative", None), self.conds.get("positive", None), self.cfg, model_options=model_options, seed=seed)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 279, in sampling_function
    out = calc_cond_batch(model, conds, x, timestep, model_options)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\samplers.py", line 228, in calc_cond_batch
    output = model.apply_model(input_x, timestep_, **c).chunk(batch_chunks)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\model_base.py", line 144, in apply_model
    model_output = self.diffusion_model(xc, t, context=context, control=control, transformer_options=transformer_options, **extra_conds).float()
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI_windows_portable\ComfyUI\comfy\ldm\flux\model.py", line 181, in forward
    out = self.forward_orig(img, img_ids, context, txt_ids, timestep, y, guidance, control, transformer_options)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: forward_orig() takes from 7 to 9 positional arguments but 10 were given

2024-11-15T00:51:56.777848 - Prompt executed in 9.57 seconds

Attached Workflow

Please make sure that workflow does not contain any sensitive information such as API keys or passwords.

{"last_node_id":124,"last_link_id":156,"nodes":[{"id":100,"type":"ApplyPulidFlux","pos":{"0":776,"1":993},"size":{"0":320,"1":346},"flags":{},"order":14,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":145,"label":"model"},{"name":"pulid_flux","type":"PULIDFLUX","link":133,"label":"pulid_flux"},{"name":"eva_clip","type":"EVA_CLIP","link":137,"label":"eva_clip"},{"name":"face_analysis","type":"FACEANALYSIS","link":135,"label":"face_analysis"},{"name":"image","type":"IMAGE","link":136,"label":"image"},{"name":"attn_mask","type":"MASK","link":null,"shape":7,"label":"attn_mask"},{"name":"prior_image","type":"IMAGE","link":null,"shape":7}],"outputs":[{"name":"MODEL","type":"MODEL","links":[141,142],"slot_index":0,"label":"MODEL"}],"properties":{"Node name for S&R":"ApplyPulidFlux"},"widgets_values":[1,0.4,1,"mean",1,0,1000,true]},{"id":101,"type":"PulidFluxEvaClipLoader","pos":{"0":384,"1":1190},"size":{"0":320,"1":30},"flags":{},"order":0,"mode":0,"inputs":[],"outputs":[{"name":"EVA_CLIP","type":"EVA_CLIP","links":[137],"slot_index":0,"label":"EVA_CLIP"}],"properties":{"Node name for S&R":"PulidFluxEvaClipLoader"},"widgets_values":[]},{"id":97,"type":"PulidFluxInsightFaceLoader","pos":{"0":383,"1":943},"size":{"0":320,"1":70},"flags":{},"order":1,"mode":0,"inputs":[],"outputs":[{"name":"FACEANALYSIS","type":"FACEANALYSIS","links":[135],"slot_index":0,"label":"FACEANALYSIS"}],"properties":{"Node name for S&R":"PulidFluxInsightFaceLoader"},"widgets_values":["CUDA"]},{"id":10,"type":"VAELoader","pos":{"0":385,"1":803},"size":{"0":320,"1":60},"flags":{},"order":2,"mode":0,"inputs":[],"outputs":[{"name":"VAE","type":"VAE","links":[12],"slot_index":0,"shape":3,"label":"VAE"}],"properties":{"Node name for S&R":"VAELoader"},"widgets_values":["ae.safetensors"]},{"id":6,"type":"CLIPTextEncode","pos":{"0":394,"1":684},"size":{"0":298.5097961425781,"1":65.9732894897461},"flags":{"collapsed":false},"order":19,"mode":0,"inputs":[{"name":"clip","type":"CLIP","link":108,"label":"clip"},{"name":"text","type":"STRING","link":148,"widget":{"name":"text"}}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[86],"slot_index":0,"label":"CONDITIONING"}],"properties":{"Node name for S&R":"CLIPTextEncode"},"widgets_values":["The image shows a model walking down a runway in a gold dress. The dress is form-fitting and has a deep V-neckline with thin straps. The skirt of the dress is draped over the model's body, creating a dramatic silhouette. The model has long blonde hair styled in loose waves and is looking directly at the camera with a serious expression. In the background, there are other models walking down the runway and a crowd of people sitting on either side of the runway. 
The lighting is dim and the overall mood of the image is elegant and sophisticated."]},{"id":11,"type":"DualCLIPLoader","pos":{"0":387,"1":496},"size":{"0":320,"1":110},"flags":{},"order":3,"mode":0,"inputs":[],"outputs":[{"name":"CLIP","type":"CLIP","links":[108],"slot_index":0,"shape":3,"label":"CLIP"}],"properties":{"Node name for S&R":"DualCLIPLoader"},"widgets_values":["t5xxl_fp16.safetensors","clip_l.safetensors","flux"]},{"id":116,"type":"ShowText|pysssss","pos":{"0":33,"1":181},"size":{"0":315,"1":76},"flags":{},"order":16,"mode":0,"inputs":[{"name":"text","type":"STRING","link":149,"widget":{"name":"text"}}],"outputs":[{"name":"STRING","type":"STRING","links":[148],"slot_index":0,"shape":6}],"properties":{"Node name for S&R":"ShowText|pysssss"},"widgets_values":["","The image is a portrait of a middle-aged man with a big smile on his face. He is standing in an office with his arms crossed in front of him. He has grey hair and a beard, and is wearing a navy blue suit with a white shirt and a white pocket square. He also has a watch on his left wrist. The background is blurred, but it appears to be an indoor space with plants and a wooden ceiling. The man looks confident and happy."]},{"id":113,"type":"Florence2Run","pos":{"0":23,"1":473},"size":{"0":340.51702880859375,"1":352},"flags":{},"order":13,"mode":0,"inputs":[{"name":"image","type":"IMAGE","link":147},{"name":"florence2_model","type":"FL2MODEL","link":146}],"outputs":[{"name":"image","type":"IMAGE","links":null},{"name":"mask","type":"MASK","links":null},{"name":"caption","type":"STRING","links":[149]},{"name":"data","type":"JSON","links":null}],"properties":{"Node name for S&R":"Florence2Run"},"widgets_values":["","more_detailed_caption",true,false,1024,3,true,"",33296997132846,"randomize"]},{"id":9,"type":"SaveImage","pos":{"0":1768,"1":200},"size":{"0":771.3804931640625,"1":930.574462890625},"flags":{},"order":24,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":9,"label":"images"}],"outputs":[],"properties":{"Node name for S&R":"SaveImage"},"widgets_values":["FLUX"]},{"id":117,"type":"GetImageSizeAndCount","pos":{"0":807,"1":664},"size":{"0":277.20001220703125,"1":86},"flags":{},"order":15,"mode":0,"inputs":[{"name":"image","type":"IMAGE","link":156}],"outputs":[{"name":"image","type":"IMAGE","links":null},{"name":"1024 width","type":"INT","links":[150],"slot_index":1},{"name":"1024 height","type":"INT","links":[151],"slot_index":2},{"name":"1 count","type":"INT","links":null}],"properties":{"Node name for S&R":"GetImageSizeAndCount"},"widgets_values":[]},{"id":120,"type":"ImageScaleToTotalPixels","pos":{"0":784,"1":818},"size":{"0":315,"1":82},"flags":{},"order":12,"mode":0,"inputs":[{"name":"image","type":"IMAGE","link":155}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[156],"slot_index":0}],"properties":{"Node name for S&R":"ImageScaleToTotalPixels"},"widgets_values":["nearest-exact",1]},{"id":102,"type":"EmptyLatentImage","pos":{"0":779,"1":497},"size":{"0":320,"1":110},"flags":{},"order":18,"mode":0,"inputs":[{"name":"height","type":"INT","link":151,"widget":{"name":"height"}},{"name":"width","type":"INT","link":150,"widget":{"name":"width"}}],"outputs":[{"name":"LATENT","type":"LATENT","links":[140],"slot_index":0,"label":"LATENT"}],"properties":{"Node name for 
S&R":"EmptyLatentImage"},"widgets_values":[768,1024,1]},{"id":60,"type":"FluxGuidance","pos":{"0":837,"1":358},"size":{"0":211.60000610351562,"1":60},"flags":{},"order":20,"mode":0,"inputs":[{"name":"conditioning","type":"CONDITIONING","link":86,"label":"conditioning"}],"outputs":[{"name":"CONDITIONING","type":"CONDITIONING","links":[87],"slot_index":0,"shape":3,"label":"CONDITIONING"}],"properties":{"Node name for S&R":"FluxGuidance"},"widgets_values":[3]},{"id":17,"type":"BasicScheduler","pos":{"0":1167,"1":688},"size":{"0":260,"1":110},"flags":{},"order":17,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":141,"slot_index":0,"label":"model"}],"outputs":[{"name":"SIGMAS","type":"SIGMAS","links":[20],"shape":3,"label":"SIGMAS"}],"properties":{"Node name for S&R":"BasicScheduler"},"widgets_values":["simple",15,1]},{"id":16,"type":"KSamplerSelect","pos":{"0":1165,"1":543},"size":{"0":260,"1":60},"flags":{},"order":4,"mode":0,"inputs":[],"outputs":[{"name":"SAMPLER","type":"SAMPLER","links":[19],"shape":3,"label":"SAMPLER"}],"properties":{"Node name for S&R":"KSamplerSelect"},"widgets_values":["euler"]},{"id":22,"type":"BasicGuider","pos":{"0":1165,"1":421},"size":{"0":260,"1":60},"flags":{"collapsed":false},"order":21,"mode":0,"inputs":[{"name":"model","type":"MODEL","link":142,"slot_index":0,"label":"model"},{"name":"conditioning","type":"CONDITIONING","link":87,"slot_index":1,"label":"conditioning"}],"outputs":[{"name":"GUIDER","type":"GUIDER","links":[30],"slot_index":0,"shape":3,"label":"GUIDER"}],"properties":{"Node name for S&R":"BasicGuider"},"widgets_values":[]},{"id":25,"type":"RandomNoise","pos":{"0":1150,"1":283},"size":{"0":290,"1":82},"flags":{},"order":5,"mode":0,"inputs":[],"outputs":[{"name":"NOISE","type":"NOISE","links":[37],"shape":3,"label":"NOISE"}],"properties":{"Node name for S&R":"RandomNoise"},"widgets_values":[600266449082634,"randomize"]},{"id":13,"type":"SamplerCustomAdvanced","pos":{"0":1453,"1":425},"size":{"0":301.4008483886719,"1":345.38330078125},"flags":{},"order":22,"mode":0,"inputs":[{"name":"noise","type":"NOISE","link":37,"slot_index":0,"label":"noise"},{"name":"guider","type":"GUIDER","link":30,"slot_index":1,"label":"guider"},{"name":"sampler","type":"SAMPLER","link":19,"slot_index":2,"label":"sampler"},{"name":"sigmas","type":"SIGMAS","link":20,"slot_index":3,"label":"sigmas"},{"name":"latent_image","type":"LATENT","link":140,"slot_index":4,"label":"latent_image"}],"outputs":[{"name":"output","type":"LATENT","links":[24],"slot_index":0,"shape":3,"label":"output"},{"name":"denoised_output","type":"LATENT","links":null,"shape":3,"label":"denoised_output"}],"properties":{"Node name for S&R":"SamplerCustomAdvanced"},"widgets_values":[]},{"id":8,"type":"VAEDecode","pos":{"0":1492,"1":281},"size":{"0":140,"1":50},"flags":{},"order":23,"mode":0,"inputs":[{"name":"samples","type":"LATENT","link":24,"label":"samples"},{"name":"vae","type":"VAE","link":12,"label":"vae"}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[9],"slot_index":0,"label":"IMAGE"}],"properties":{"Node name for S&R":"VAEDecode"},"widgets_values":[]},{"id":121,"type":"Note","pos":{"0":395.9151916503906,"1":136.14727783203125},"size":{"0":737.174072265625,"1":171.19815063476562},"flags":{},"order":6,"mode":0,"inputs":[],"outputs":[],"properties":{},"widgets_values":["Commands if you run into PuLID Node Errors:\n\n\nE:\\pinokio\\api\\comfyui.git\\app\\env\\Scripts\\python.exe -m pip install facexlib onnxruntime\n\n\nE:\\pinokio\\api\\comfyui.git\\app\\env\\Scripts\\python.exe -m pip 
install E:\\pinokio\\api\\comfyui.git\\app\\insightface-0.7.3-cp310-cp310-win_amd64.whl onnxruntime\n\nE:\\pinokio\\api\\comfyui.git\\app\\env\\Scripts\\python.exe -s -m pip install -r requirements.txt onnxruntime"],"color":"#432","bgcolor":"#653"},{"id":114,"type":"LoadImage","pos":{"0":27,"1":882},"size":{"0":320,"1":338},"flags":{},"order":7,"mode":0,"inputs":[],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[147,155],"slot_index":0,"label":"IMAGE"},{"name":"MASK","type":"MASK","links":null,"label":"MASK"}],"title":"Load Image Prompt","properties":{"Node name for S&R":"LoadImage"},"widgets_values":["ComfyUI_00019_.png","image"]},{"id":93,"type":"LoadImage","pos":{"0":1160,"1":872},"size":{"0":320,"1":338},"flags":{},"order":8,"mode":0,"inputs":[],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[136],"slot_index":0,"label":"IMAGE"},{"name":"MASK","type":"MASK","links":null,"label":"MASK"}],"title":"Load Image Face","properties":{"Node name for S&R":"LoadImage"},"widgets_values":["ma01-59.jpg","image"]},{"id":112,"type":"UNETLoader","pos":{"0":391,"1":357},"size":{"0":315,"1":82},"flags":{},"order":9,"mode":0,"inputs":[],"outputs":[{"name":"MODEL","type":"MODEL","links":[145],"slot_index":0}],"properties":{"Node name for S&R":"UNETLoader"},"widgets_values":["flux1-dev.safetensors","default"]},{"id":115,"type":"DownloadAndLoadFlorence2Model","pos":{"0":28,"1":303},"size":{"0":331.9540710449219,"1":106},"flags":{},"order":10,"mode":0,"inputs":[{"name":"lora","type":"PEFTLORA","link":null,"shape":7}],"outputs":[{"name":"florence2_model","type":"FL2MODEL","links":[146],"slot_index":0}],"properties":{"Node name for S&R":"DownloadAndLoadFlorence2Model"},"widgets_values":["microsoft/Florence-2-base","fp16","sdpa"]},{"id":99,"type":"PulidFluxModelLoader","pos":{"0":386,"1":1069},"size":{"0":320,"1":60},"flags":{},"order":11,"mode":0,"inputs":[],"outputs":[{"name":"PULIDFLUX","type":"PULIDFLUX","links":[133],"slot_index":0,"label":"PULIDFLUX"}],"properties":{"Node name for S&R":"PulidFluxModelLoader"},"widgets_values":["pulid_flux_v0.9.0.safetensors"]}],"links":[[9,8,0,9,0,"IMAGE"],[12,10,0,8,1,"VAE"],[19,16,0,13,2,"SAMPLER"],[20,17,0,13,3,"SIGMAS"],[24,13,0,8,0,"LATENT"],[30,22,0,13,1,"GUIDER"],[37,25,0,13,0,"NOISE"],[86,6,0,60,0,"CONDITIONING"],[87,60,0,22,1,"CONDITIONING"],[108,11,0,6,0,"CLIP"],[133,99,0,100,1,"PULIDFLUX"],[135,97,0,100,3,"FACEANALYSIS"],[136,93,0,100,4,"IMAGE"],[137,101,0,100,2,"EVA_CLIP"],[140,102,0,13,4,"LATENT"],[141,100,0,17,0,"MODEL"],[142,100,0,22,0,"MODEL"],[145,112,0,100,0,"MODEL"],[146,115,0,113,1,"FL2MODEL"],[147,114,0,113,0,"IMAGE"],[148,116,0,6,1,"STRING"],[149,113,2,116,0,"STRING"],[150,117,1,102,1,"INT"],[151,117,2,102,0,"INT"],[155,114,0,120,0,"IMAGE"],[156,120,0,117,0,"IMAGE"]],"groups":[],"config":{},"extra":{"ds":{"scale":0.8264462809917362,"offset":[-609.5334481899447,-206.95446139871885]},"workspace_info":{"id":"oYzKNswvlVDYwnwN0203Z"}},"version":0.4}

Additional Context

I am using the workflow and tutorial from AI Motion Studio
https://www.youtube.com/watch?v=xUduNl7-pE0

ComfyUI v0.2.7-22-g122c9ca
ComfyUI_frontend v1.3.26

System Info

  • OS: nt
  • Python Version: 3.12.7 (tags/v3.12.7:0b05ead, Oct 1 2024, 03:06:41) [MSC v.1941 64 bit (AMD64)]
  • Embedded Python: true
  • PyTorch Version: 2.5.1+cu124
  • Arguments: ComfyUI\main.py --windows-standalone-build
  • RAM Total: 63.62 GB
  • RAM Free: 38.75 GB

Devices

  • Name: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
  • Type: cuda
  • VRAM Total: 23.99 GB

@af-74413592

1. Roll back ComfyUI, or
2. Edit pulidflux.py: at line 78, add a transformer_options={} parameter to the forward_orig() function (see the sketch below).
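
A minimal sketch of option 2, assuming pulidflux.py defines its own forward_orig override near line 78. The parameter names below only mirror the call site in comfy/ldm/flux/model.py from the trace and may not match the actual names in pulidflux.py; the essential change is accepting the extra transformer_options argument that newer ComfyUI versions pass:

```python
# Hypothetical sketch, not the actual PuLID-Flux code: keep the existing body of
# the override and only extend the signature with transformer_options so the
# tenth positional argument from comfy/ldm/flux/model.py is accepted.
def forward_orig(self, img, img_ids, txt, txt_ids, timesteps, y,
                 guidance=None, control=None, transformer_options={}):
    # ... existing PuLID-Flux forward logic unchanged; transformer_options can
    # simply be ignored here or forwarded on if the rest of the code needs it.
    ...
```

Option 1 (rolling back ComfyUI to a build from before transformer_options was added to the Flux forward call) avoids the mismatch without editing the custom node.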
