TypeError: cannot unpack non-iterable NoneType object #37


Open
LuizRemy opened this issue Jan 13, 2025 · 38 comments

Comments

@LuizRemy

I just reinstalled ComfyUI and tried to use the OmniGen custom node, but it fails.
I'm a newbie with Python and code.

Please help, I need to fix this as soon as possible.


ailab_OmniGen

cannot unpack non-iterable NoneType object

got prompt
Downloading OmniGen code from GitHub...
Downloaded model.py
Downloaded pipeline.py
Downloaded processor.py
Downloaded scheduler.py
Downloaded transformer.py
Downloaded utils.py
Downloaded __init__.py
OmniGen code setup completed
OmniGen models verified successfully
Current model instance: None
Current model precision: None
Loading safetensors
Warning: Pipeline.to(device) returned None, using original pipeline
VRAM usage after pipeline creation: 15102.28MB
Processing with prompt: Transform <|image_1|> into an oil painting, giving it a textured, classic style with visible brushstrokes and rich color.
Model will be kept during generation
0%| | 0/25 [00:00<?, ?it/s]
Error during generation: cannot unpack non-iterable NoneType object
!!! Exception during processing !!! cannot unpack non-iterable NoneType object
Traceback (most recent call last):
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\execution.py", line 327, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\execution.py", line 202, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\execution.py", line 174, in _map_node_over_list
process_inputs(input_dict, i)
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\execution.py", line 163, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-OmniGen\ailab_OmniGen.py", line 387, in generation
raise e
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-OmniGen\ailab_OmniGen.py", line 353, in generation
output = pipe(
^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\pipeline.py", line 286, in call
samples = scheduler(latents, func, model_kwargs, use_kv_cache=use_kv_cache, offload_kv_cache=offload_kv_cache)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\scheduler.py", line 164, in call
pred, cache = func(z, timesteps, past_key_values=cache, **model_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\model.py", line 388, in forward_with_separate_cfg
temp_out, temp_pask_key_values = self.forward(x[i], timestep[i], input_ids[i], input_img_latents[i], input_image_sizes[i], attention_mask[i], position_ids[i], past_key_values=past_key_values[i], return_past_key_values=True, offload_model=offload_model)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\model.py", line 338, in forward
output = self.llm(inputs_embeds=input_emb, attention_mask=attention_mask, position_ids=position_ids, past_key_values=past_key_values, offload_model=offload_model)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\transformer.py", line 157, in forward
layer_outputs = decoder_layer(
^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 295, in forward
hidden_states, self_attn_weights = self.self_attn(
^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 189, in forward
cos, sin = position_embeddings
^^^^^^^^
TypeError: cannot unpack non-iterable NoneType object

Prompt executed in 304.61 seconds
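
For reference, the failing statement inside transformers' Phi-3 attention is cos, sin = position_embeddings, so the error simply means that position_embeddings arrived as None from the node's vendored OmniGen code. A minimal sketch of the same failure (a standalone illustration, not the node's actual code):

    # position_embeddings should be a (cos, sin) tuple of rotary embeddings;
    # the vendored transformer.py ends up handing None to the attention layer
    position_embeddings = None
    cos, sin = position_embeddings  # TypeError: cannot unpack non-iterable NoneType object

Recent transformers releases moved the rotary (cos, sin) computation up to the model's forward pass and hand it down to each decoder layer, which is why vendored modeling code written against an older transformers API can end up passing None to the attention layer and hitting exactly this unpack.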


@LuizRemy
Author

LuizRemy commented Jan 13, 2025

I was able to get OmniGen running in a virtual env as Shatoi said in his post,

but I never got it working with ComfyUI.

@sunhaha123

same problem!

@1038lab
Owner

1038lab commented Jan 23, 2025

Updated, this should fix your issue.

@tvmrm

tvmrm commented Jan 26, 2025

Got the same error after Python package updates:

ComfyUI Error Report

Error Details

  • Node ID: 6
  • Node Type: ailab_OmniGen
  • Exception Type: TypeError
  • Exception Message: cannot unpack non-iterable NoneType object

Stack Trace

  File "C:\ComfyUI\ComfyUI\execution.py", line 327, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI\execution.py", line 202, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI\execution.py", line 174, in _map_node_over_list
    process_inputs(input_dict, i)

  File "C:\ComfyUI\ComfyUI\execution.py", line 163, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-OmniGen\AILab_OmniGen.py", line 387, in generation
    raise e

  File "C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-OmniGen\AILab_OmniGen.py", line 353, in generation
    output = pipe(
             ^^^^^

  File "C:\ComfyUI\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\pipeline.py", line 286, in __call__
    samples = scheduler(latents, func, model_kwargs, use_kv_cache=use_kv_cache, offload_kv_cache=offload_kv_cache)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\scheduler.py", line 164, in __call__
    pred, cache = func(z, timesteps, past_key_values=cache, **model_kwargs)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\model.py", line 388, in forward_with_separate_cfg
    temp_out, temp_pask_key_values = self.forward(x[i], timestep[i], input_ids[i], input_img_latents[i], input_image_sizes[i], attention_mask[i], position_ids[i], past_key_values=past_key_values[i], return_past_key_values=True, offload_model=offload_model)
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\model.py", line 338, in forward
    output = self.llm(inputs_embeds=input_emb, attention_mask=attention_mask, position_ids=position_ids, past_key_values=past_key_values, offload_model=offload_model)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\transformer.py", line 157, in forward
    layer_outputs = decoder_layer(
                    ^^^^^^^^^^^^^^

  File "C:\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\python_embeded\Lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 295, in forward
    hidden_states, self_attn_weights = self.self_attn(
                                       ^^^^^^^^^^^^^^^

  File "C:\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\ComfyUI\python_embeded\Lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 189, in forward
    cos, sin = position_embeddings
    ^^^^^^^^

System Information

  • ComfyUI Version: 0.3.12
  • Arguments: ComfyUI\main.py --windows-standalone-build
  • OS: nt
  • Python Version: 3.12.7 (tags/v3.12.7:0b05ead, Oct 1 2024, 03:06:41) [MSC v.1941 64 bit (AMD64)]
  • Embedded Python: true
  • PyTorch Version: 2.5.1+cu124

Devices

  • Name: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
    • Type: cuda
    • VRAM Total: 25756696576
    • VRAM Free: 14185760092
    • Torch VRAM Total: 16072572928
    • Torch VRAM Free: 6243845468

Logs

2025-01-25T19:51:34.904281 - 3.12.7 (tags/v3.12.7:0b05ead, Oct  1 2024, 03:06:41) [MSC v.1941 64 bit (AMD64)]
2025-01-25T19:51:34.904281 - ** Python executable: C:\ComfyUI\python_embeded\python.exe
2025-01-25T19:51:34.904281 - ** ComfyUI Path: C:\ComfyUI\ComfyUI
2025-01-25T19:51:34.904281 - ** User directory: C:\ComfyUI\ComfyUI\user
2025-01-25T19:51:34.904281 - ** ComfyUI-Manager config path: C:\ComfyUI\ComfyUI\user\default\ComfyUI-Manager\config.ini
2025-01-25T19:51:34.904281 - ** Log path: C:\ComfyUI\ComfyUI\user\comfyui.log
2025-01-25T19:51:35.733086 - [MaraScott] Initialization
2025-01-25T19:51:35.733086 - 
Prestartup times for custom nodes:
2025-01-25T19:51:35.733086 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\rgthree-comfy
2025-01-25T19:51:35.733086 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Easy-Use
2025-01-25T19:51:35.733086 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_MaraScott_Nodes
2025-01-25T19:51:35.733086 -    2.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Manager
2025-01-25T19:51:35.733086 - 
2025-01-25T19:51:38.210702 - Checkpoint files will always be loaded safely.
2025-01-25T19:51:38.383370 - Total VRAM 24564 MB, total RAM 65254 MB
2025-01-25T19:51:38.383370 - pytorch version: 2.5.1+cu124
2025-01-25T19:51:38.383370 - Set vram state to: NORMAL_VRAM
2025-01-25T19:51:38.383370 - Device: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
2025-01-25T19:51:39.270581 - Using pytorch attention
2025-01-25T19:51:40.703922 - ComfyUI version: 0.3.12
2025-01-25T19:51:40.733016 - [Prompt Server] web root: C:\ComfyUI\ComfyUI\web
2025-01-25T19:51:42.125130 - Adding2025-01-25T19:51:42.125130 -  2025-01-25T19:51:42.125130 - C:\ComfyUI\ComfyUI\custom_nodes2025-01-25T19:51:42.125130 -  2025-01-25T19:51:42.125130 - to sys.path2025-01-25T19:51:42.125130 - 
2025-01-25T19:51:42.187900 - Could not find efficiency nodes2025-01-25T19:51:42.187900 - 
2025-01-25T19:51:42.187900 - Could not find ControlNetPreprocessors nodes2025-01-25T19:51:42.187900 - 
2025-01-25T19:51:42.187900 - Could not find AdvancedControlNet nodes2025-01-25T19:51:42.187900 - 
2025-01-25T19:51:42.187900 - Could not find AnimateDiff nodes2025-01-25T19:51:42.187900 - 
2025-01-25T19:51:42.187900 - Could not find IPAdapter nodes2025-01-25T19:51:42.187900 - 
2025-01-25T19:51:42.204024 - Could not find VideoHelperSuite nodes2025-01-25T19:51:42.204024 - 
2025-01-25T19:51:42.208129 - ### Loading: ComfyUI-Impact-Pack (V8.4.1)2025-01-25T19:51:42.208129 - 
2025-01-25T19:51:42.251787 - ### Loading: ComfyUI-Impact-Pack (V8.4.1)2025-01-25T19:51:42.251787 - 
2025-01-25T19:51:42.251787 - Loaded ImpactPack nodes from2025-01-25T19:51:42.251787 - [Impact Pack] Wildcards loading done.2025-01-25T19:51:42.251787 -  2025-01-25T19:51:42.251787 - 
2025-01-25T19:51:42.251787 - C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack2025-01-25T19:51:42.251787 - 
2025-01-25T19:51:42.251787 - [Impact Pack] Wildcards loading done.2025-01-25T19:51:42.251787 - 
2025-01-25T19:51:42.355064 - [Crystools �[0;32mINFO�[0m] Crystools version: 1.21.0
2025-01-25T19:51:42.376648 - [Crystools �[0;32mINFO�[0m] CPU: Intel(R) Core(TM) i9-14900KF - Arch: AMD64 - OS: Windows 11
2025-01-25T19:51:42.388256 - [Crystools �[0;32mINFO�[0m] Pynvml (Nvidia) initialized.
2025-01-25T19:51:42.388256 - [Crystools �[0;32mINFO�[0m] GPU/s:
2025-01-25T19:51:42.394320 - [Crystools �[0;32mINFO�[0m] 0) NVIDIA GeForce RTX 4090
2025-01-25T19:51:42.394320 - [Crystools �[0;32mINFO�[0m] NVIDIA Driver: 560.94
2025-01-25T19:51:42.867843 - �[34m[ComfyUI-Easy-Use] server: �[0mv1.2.7 �[92mLoaded�[0m2025-01-25T19:51:42.867843 - 
2025-01-25T19:51:42.867843 - �[34m[ComfyUI-Easy-Use] web root: �[0mC:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Easy-Use\web_version/v2 �[92mLoaded�[0m2025-01-25T19:51:42.867843 - 
2025-01-25T19:51:42.883599 - [ComfyUI-HakuImg]-|19:51:42|-�[0;32mINFO�[0m: Check HakuImg requirements
2025-01-25T19:51:42.891524 - [ComfyUI-HakuImg]-|19:51:42|-�[0;32mINFO�[0m: Check HakuImg requirements done
2025-01-25T19:51:42.891524 - [ComfyUI-HakuImg]-|19:51:42|-�[0;32mINFO�[0m: Check HakuImg submodule module: PixelOE
2025-01-25T19:51:42.895687 - [ComfyUI-HakuImg]-|19:51:42|-�[0;32mINFO�[0m: Check HakuImg submodule module done
2025-01-25T19:51:42.915252 - ### Loading: ComfyUI-Impact-Pack (V8.4.1)2025-01-25T19:51:42.915252 - 
2025-01-25T19:51:42.915252 - [Impact Pack] Wildcards loading done.2025-01-25T19:51:42.915252 - 
2025-01-25T19:51:42.915252 - ### Loading: ComfyUI-Impact-Subpack (V1.2.9)
2025-01-25T19:51:43.272428 - [Impact Subpack] ultralytics_bbox: C:\ComfyUI\ComfyUI\models\ultralytics\bbox
2025-01-25T19:51:43.272428 - [Impact Subpack] ultralytics_segm: C:\ComfyUI\ComfyUI\models\ultralytics\segm
2025-01-25T19:51:43.288546 - ### Loading: ComfyUI-Inspire-Pack (V1.10)2025-01-25T19:51:43.288546 - 
2025-01-25T19:51:43.540076 - Total VRAM 24564 MB, total RAM 65254 MB
2025-01-25T19:51:43.540076 - pytorch version: 2.5.1+cu124
2025-01-25T19:51:43.540076 - Set vram state to: NORMAL_VRAM
2025-01-25T19:51:43.541076 - Device: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
2025-01-25T19:51:43.559563 - ### Loading: ComfyUI-Manager (V3.9.2)
2025-01-25T19:51:43.716173 - ### ComfyUI Version: v0.3.12-24-g67feb052 | Released on '2025-01-25'
2025-01-25T19:51:43.942656 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
2025-01-25T19:51:43.957965 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json
2025-01-25T19:51:43.965638 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
2025-01-25T19:51:43.989648 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json
2025-01-25T19:51:44.010507 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
2025-01-25T19:51:44.407146 - Failed to auto update `Quality of Life Suit` 2025-01-25T19:51:44.407146 - 
2025-01-25T19:51:44.407146 - �[33mQualityOfLifeSuit_Omar92_DIR:�[0m C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-QualityOfLifeSuit_Omar922025-01-25T19:51:44.407146 - 
2025-01-25T19:51:44.407146 - (pysssss:WD14Tagger) [DEBUG] Available ORT providers: TensorrtExecutionProvider, CUDAExecutionProvider, CPUExecutionProvider2025-01-25T19:51:44.407146 - 
2025-01-25T19:51:44.407146 - (pysssss:WD14Tagger) [DEBUG] Using ORT providers: CUDAExecutionProvider, CPUExecutionProvider2025-01-25T19:51:44.407146 - 
2025-01-25T19:51:44.438722 - ------------------------------------------2025-01-25T19:51:44.438722 - 
2025-01-25T19:51:44.438722 - �[34mComfyroll Studio v1.76 : �[92m 175 Nodes Loaded�[0m2025-01-25T19:51:44.438722 - 
2025-01-25T19:51:44.438722 - ------------------------------------------2025-01-25T19:51:44.438722 - 
2025-01-25T19:51:44.438722 - ** For changes, please see patch notes at https://github.com/Suzie1/ComfyUI_Comfyroll_CustomNodes/blob/main/Patch_Notes.md2025-01-25T19:51:44.438722 - 
2025-01-25T19:51:44.438722 - ** For help, please see the wiki at https://github.com/Suzie1/ComfyUI_Comfyroll_CustomNodes/wiki2025-01-25T19:51:44.438722 - 
2025-01-25T19:51:44.438722 - ------------------------------------------2025-01-25T19:51:44.438722 - 
2025-01-25T19:51:44.445792 - �[36;20m[ComfyUI_ControlNet_AUX] | INFO -> Using ckpts path: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_ControlNet_AUX\ckpts�[0m
2025-01-25T19:51:44.445792 - �[36;20m[ComfyUI_ControlNet_AUX] | INFO -> Using symlinks: False�[0m
2025-01-25T19:51:44.445792 - �[36;20m[ComfyUI_ControlNet_AUX] | INFO -> Using ort providers: ['CUDAExecutionProvider', 'DirectMLExecutionProvider', 'OpenVINOExecutionProvider', 'ROCMExecutionProvider', 'CPUExecutionProvider', 'CoreMLExecutionProvider']�[0m
2025-01-25T19:51:44.467988 - DWPose: Onnxruntime with acceleration providers detected2025-01-25T19:51:44.467988 - 
2025-01-25T19:51:44.477959 - �[1;35m### [START] ComfyUI AlekPet Nodes �[1;34mv1.0.45�[0m�[1;35m ###�[0m2025-01-25T19:51:44.477959 - 
2025-01-25T19:51:44.972284 - �[92mNode -> ChatGLMNode: �[93mChatGLM4TranslateCLIPTextEncodeNode, ChatGLM4TranslateTextNode, ChatGLM4InstructNode, ChatGLM4InstructMediaNode�[0m �[92m[Loading] �[0m2025-01-25T19:51:44.972284 - 
2025-01-25T19:51:44.979288 - �[92mNode -> ArgosTranslateNode: �[93mArgosTranslateCLIPTextEncodeNode, ArgosTranslateTextNode�[0m �[92m[Loading] �[0m2025-01-25T19:51:44.979288 - �[92mNode -> DeepTranslatorNode: �[93mDeepTranslatorCLIPTextEncodeNode, DeepTranslatorTextNode�[0m �[92m[Loading] �[0m2025-01-25T19:51:44.979288 - 
2025-01-25T19:51:44.980288 - 
2025-01-25T19:51:44.980288 - �[92mNode -> GoogleTranslateNode: �[93mGoogleTranslateCLIPTextEncodeNode, GoogleTranslateTextNode�[0m �[92m[Loading] �[0m2025-01-25T19:51:44.980288 - 
2025-01-25T19:51:44.981288 - �[92mNode -> PoseNode: �[93mPoseNode�[0m �[92m[Loading] �[0m2025-01-25T19:51:44.981288 - 
2025-01-25T19:51:44.983288 - �[92mNode -> ExtrasNode: �[93mPreviewTextNode, HexToHueNode, ColorsCorrectNode�[0m �[92m[Loading] �[0m2025-01-25T19:51:44.983288 - 
2025-01-25T19:51:45.017834 - �[92mNode -> IDENode: �[93mIDENode�[0m �[92m[Loading] �[0m2025-01-25T19:51:45.017834 - 
2025-01-25T19:51:45.152629 - �[92mNode -> PainterNode: �[93mPainterNode�[0m �[92m[Loading] �[0m2025-01-25T19:51:45.152629 - 
2025-01-25T19:51:45.153629 - �[1;35m### [END] ComfyUI AlekPet Nodes ###�[0m2025-01-25T19:51:45.153629 - 
2025-01-25T19:51:45.489487 - LOADED 6 FONTS2025-01-25T19:51:45.489487 - 
2025-01-25T19:51:45.489487 - No OpenGL_accelerate module loaded: No module named 'OpenGL_accelerate'
2025-01-25T19:51:46.257318 - 

███╗   ███╗ █████╗  ██████╗██╗  ██╗██╗███╗   ██╗███████╗               
████╗ ████║██╔══██╗██╔════╝██║  ██║██║████╗  ██║██╔════╝               
██╔████╔██║███████║██║     ███████║██║██╔██╗ ██║█████╗                 
██║╚██╔╝██║██╔══██║██║     ██╔══██║██║██║╚██╗██║██╔══╝                 
██║ ╚═╝ ██║██║  ██║╚██████╗██║  ██║██║██║ ╚████║███████╗               
╚═╝     ╚═╝╚═╝  ╚═╝ ╚═════╝╚═╝  ╚═╝╚═╝╚═╝  ╚═══╝╚══════╝               
                                                                       
██████╗ ███████╗██╗     ██╗   ██╗███████╗██╗ ██████╗ ███╗   ██╗███████╗
██╔══██╗██╔════╝██║     ██║   ██║██╔════╝██║██╔═══██╗████╗  ██║██╔════╝
██║  ██║█████╗  ██║     ██║   ██║███████╗██║██║   ██║██╔██╗ ██║███████╗
██║  ██║██╔══╝  ██║     ██║   ██║╚════██║██║██║   ██║██║╚██╗██║╚════██║
██████╔╝███████╗███████╗╚██████╔╝███████║██║╚██████╔╝██║ ╚████║███████║
╚═════╝ ╚══════╝╚══════╝ ╚═════╝ ╚══════╝╚═╝ ╚═════╝ ╚═╝  ╚═══╝╚══════╝
                                                                       

2025-01-25T19:51:46.257318 - 
2025-01-25T19:51:46.261689 - --------------2025-01-25T19:51:46.261689 - 
2025-01-25T19:51:46.261689 - *ComfyUI_Jags_VectorMagic- nodes_loaded*2025-01-25T19:51:46.261689 - 
2025-01-25T19:51:46.261689 - --------------2025-01-25T19:51:46.261689 - 
2025-01-25T19:51:46.449244 - �[34m[MaraScott] �[92mLoaded�[0m2025-01-25T19:51:46.450245 - 
2025-01-25T19:51:46.603290 - # 😺 ComfyUI_OmniGen_Wrapper: -> �[1;32mSuccess loaded 1 nodes.�[m2025-01-25T19:51:46.614735 - 
2025-01-25T19:51:46.661377 - �[36;20m[comfy_mtb] | INFO -> loaded �[96m91�[0m nodes successfuly�[0m
2025-01-25T19:51:46.662379 - �[36;20m[comfy_mtb] | INFO -> Some nodes (5) could not be loaded. This can be ignored, but go to http://127.0.0.1:8188/mtb if you want more information.�[0m
2025-01-25T19:51:46.665381 - 
�[32mInitializing ControlAltAI Nodes�[0m2025-01-25T19:51:46.665381 - 
2025-01-25T19:51:46.676986 - Plush - Running on python installation: C:\ComfyUI\python_embeded\python.exe, ver: 3.12.7 (tags/v3.12.7:0b05ead, Oct  1 2024, 03:06:41) [MSC v.1941 64 bit (AMD64)]2025-01-25T19:51:46.676986 - 
2025-01-25T19:51:46.676986 - Plush - Current Openai Version: 2025-01-25T19:51:46.676986 -  2025-01-25T19:51:46.676986 - 1.55.32025-01-25T19:51:46.676986 - 
2025-01-25T19:51:46.686645 - Plush - Version:2025-01-25T19:51:46.686645 -  2025-01-25T19:51:46.686645 - 1.21.222025-01-25T19:51:46.686645 - 
2025-01-25T19:51:46.788857 - 
2025-01-25T19:51:46.788857 - �[92m[rgthree-comfy] Loaded 42 extraordinary nodes. 🎉�[00m2025-01-25T19:51:46.789858 - 
2025-01-25T19:51:46.789858 - 
2025-01-25T19:51:46.824453 - 
Import times for custom nodes:
2025-01-25T19:51:46.824453 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\LAizypainter-Exporter-ComfyUI
2025-01-25T19:51:46.824453 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\websocket_image_save.py
2025-01-25T19:51:46.824453 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\UtilNodes-ComfyUI-main
2025-01-25T19:51:46.825452 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-SDXL-EmptyLatentImage
2025-01-25T19:51:46.825452 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\comfyui-seamless-tiling
2025-01-25T19:51:46.825452 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\cg-use-everywhere
2025-01-25T19:51:46.825452 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Eagleshadow
2025-01-25T19:51:46.825452 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\masquerade-nodes-comfyui
2025-01-25T19:51:46.825452 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\comfyui-inpaint-cropandstitch
2025-01-25T19:51:46.825452 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Miaoshouai-Tagger
2025-01-25T19:51:46.825452 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\Comfy-Photoshop-SD
2025-01-25T19:51:46.825452 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-post-processing-nodes
2025-01-25T19:51:46.825452 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-mxToolkit
2025-01-25T19:51:46.825452 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-WD14-Tagger
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\OmniGen-ComfyUI
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\CRT-Nodes
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-GGUF
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-RAVE_ATTN
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-QualityOfLifeSuit_Omar92
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\comfyui_jags_vectormagic
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-OmniGen
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\comfyui-various
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_essentials
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_UltimateSDUpscale
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Florence2
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Chibi-Nodes
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ControlAltAI-Nodes
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\comfy-image-saver
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\x-flux-comfyui-main
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\comfyui-dream-video-batches
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Custom-Scripts
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\x-flux-comfyui
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\rgthree-comfy
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\virtuoso-nodes
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Comfyroll_CustomNodes
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-KJNodes
2025-01-25T19:51:46.826451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\aegisflow_utility_nodes
2025-01-25T19:51:46.827451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Inspire-Pack
2025-01-25T19:51:46.827451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-HakuImg
2025-01-25T19:51:46.827451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_ControlNet_AUX
2025-01-25T19:51:46.827451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\comfy_mtb
2025-01-25T19:51:46.827451 -    0.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Crystools
2025-01-25T19:51:46.827451 -    0.1 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_LayerStyle
2025-01-25T19:51:46.827451 -    0.1 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_MaraScott_Nodes
2025-01-25T19:51:46.827451 -    0.1 seconds: C:\ComfyUI\ComfyUI\custom_nodes\Plush-for-ComfyUI
2025-01-25T19:51:46.827451 -    0.2 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_OmniGen_Wrapper
2025-01-25T19:51:46.827451 -    0.2 seconds: C:\ComfyUI\ComfyUI\custom_nodes\comfyui-propost
2025-01-25T19:51:46.827451 -    0.2 seconds: C:\ComfyUI\ComfyUI\custom_nodes\comfyui-art-venture
2025-01-25T19:51:46.827451 -    0.2 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-iTools
2025-01-25T19:51:46.827451 -    0.3 seconds: C:\ComfyUI\ComfyUI\custom_nodes\comfyui-ollama
2025-01-25T19:51:46.827451 -    0.4 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Manager
2025-01-25T19:51:46.827451 -    0.4 seconds: C:\ComfyUI\ComfyUI\custom_nodes\comfyui-impact-subpack
2025-01-25T19:51:46.827451 -    0.5 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Easy-Use
2025-01-25T19:51:46.827451 -    0.8 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Fill-Nodes
2025-01-25T19:51:46.827451 -    0.9 seconds: C:\ComfyUI\ComfyUI\custom_nodes\clipseg.py
2025-01-25T19:51:46.827451 -    1.0 seconds: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_Custom_Nodes_AlekPet
2025-01-25T19:51:46.827451 - 
2025-01-25T19:51:46.834420 - Starting server

2025-01-25T19:51:46.834420 - To see the GUI go to: http://127.0.0.1:8188
2025-01-25T19:51:47.490050 - FETCH DATA from: C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Manager\extension-node-map.json2025-01-25T19:51:47.490050 - 2025-01-25T19:51:47.493932 -  [DONE]2025-01-25T19:51:47.493932 - 
2025-01-25T19:51:48.161481 - [Inspire Pack] IPAdapterPlus is not installed.2025-01-25T19:51:48.161481 - 
2025-01-25T19:51:48.208598 - �[33mQualityOfLifeSuit_Omar92:�[0m:NSP ready2025-01-25T19:51:48.208598 - 
2025-01-25T19:51:48.631962 - FETCH ComfyRegistry Data: 5/312025-01-25T19:51:48.631962 - 
2025-01-25T19:51:52.399717 - ['lora.safetensors', 'realism_lora.safetensors']2025-01-25T19:51:52.399717 - 
2025-01-25T19:51:52.399717 - ['lora.safetensors', 'realism_lora.safetensors']2025-01-25T19:51:52.399717 - 
2025-01-25T19:51:54.399019 - FETCH ComfyRegistry Data: 10/312025-01-25T19:51:54.399019 - 
2025-01-25T19:51:59.755540 - FETCH ComfyRegistry Data: 15/312025-01-25T19:51:59.755540 - 
2025-01-25T19:52:05.149143 - FETCH ComfyRegistry Data: 20/312025-01-25T19:52:05.149143 - 
2025-01-25T19:52:10.864240 - FETCH ComfyRegistry Data: 25/312025-01-25T19:52:10.864240 - 
2025-01-25T19:52:16.835987 - FETCH ComfyRegistry Data: 30/312025-01-25T19:52:16.835987 - 
2025-01-25T19:52:18.581586 - FETCH ComfyRegistry Data [DONE]2025-01-25T19:52:18.581586 - 
2025-01-25T19:52:18.627437 - [ComfyUI-Manager] default cache updated: https://api.comfy.org/nodes
2025-01-25T19:52:18.665815 - nightly_channel: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/remote
2025-01-25T19:52:18.665815 - FETCH DATA from: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json2025-01-25T19:52:18.665815 - 2025-01-25T19:52:18.708638 -  [DONE]2025-01-25T19:52:18.708638 - 
2025-01-25T19:52:38.402052 - got prompt
2025-01-25T19:52:38.475153 - OmniGen code already exists
2025-01-25T19:52:38.475153 - OmniGen models verified successfully
2025-01-25T19:52:38.475153 - Current model instance: None
2025-01-25T19:52:38.475153 - Current model precision: None
2025-01-25T19:53:20.238917 - Loading safetensors
2025-01-25T19:53:28.317474 - Warning: Pipeline.to(device) returned None, using original pipeline
2025-01-25T19:53:28.317474 - VRAM usage after pipeline creation: 15102.28MB
2025-01-25T19:53:28.521460 - Processing with prompt: <img><|image_1|></img> replace the girl with <img><|image_2|></img> girl standing in light casual sweater holding a white coffee mug smilinng at the viewer
2025-01-25T19:53:28.521460 - Model will be kept during generation
2025-01-25T19:53:29.740432 -
  0%| | 0/50 [00:00<?, ?it/s]
  0%| | 0/50 [00:00<?, ?it/s]
2025-01-25T19:53:29.927815 - Error during generation: cannot unpack non-iterable NoneType object
2025-01-25T19:53:29.927815 - !!! Exception during processing !!! cannot unpack non-iterable NoneType object
2025-01-25T19:53:29.927815 - Traceback (most recent call last):
  File "C:\ComfyUI\ComfyUI\execution.py", line 327, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI\execution.py", line 202, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI\execution.py", line 174, in _map_node_over_list
    process_inputs(input_dict, i)
  File "C:\ComfyUI\ComfyUI\execution.py", line 163, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-OmniGen\AILab_OmniGen.py", line 387, in generation
    raise e
  File "C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-OmniGen\AILab_OmniGen.py", line 353, in generation
    output = pipe(
             ^^^^^
  File "C:\ComfyUI\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\pipeline.py", line 286, in __call__
    samples = scheduler(latents, func, model_kwargs, use_kv_cache=use_kv_cache, offload_kv_cache=offload_kv_cache)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\scheduler.py", line 164, in __call__
    pred, cache = func(z, timesteps, past_key_values=cache, **model_kwargs)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\model.py", line 388, in forward_with_separate_cfg
    temp_out, temp_pask_key_values = self.forward(x[i], timestep[i], input_ids[i], input_img_latents[i], input_image_sizes[i], attention_mask[i], position_ids[i], past_key_values=past_key_values[i], return_past_key_values=True, offload_model=offload_model)
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\model.py", line 338, in forward
    output = self.llm(inputs_embeds=input_emb, attention_mask=attention_mask, position_ids=position_ids, past_key_values=past_key_values, offload_model=offload_model)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\transformer.py", line 157, in forward
    layer_outputs = decoder_layer(
                    ^^^^^^^^^^^^^^
  File "C:\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\python_embeded\Lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 295, in forward
    hidden_states, self_attn_weights = self.self_attn(
                                       ^^^^^^^^^^^^^^^
  File "C:\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ComfyUI\python_embeded\Lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 189, in forward
    cos, sin = position_embeddings
    ^^^^^^^^
TypeError: cannot unpack non-iterable NoneType object

2025-01-25T19:53:29.943106 - Prompt executed in 51.53 seconds

Attached Workflow

Please make sure that workflow does not contain any sensitive information such as API keys or passwords.

{"last_node_id":12,"last_link_id":15,"nodes":[{"id":6,"type":"ailab_OmniGen","pos":[40,20],"size":[410,580],"flags":{},"order":2,"mode":0,"inputs":[{"name":"image_1","type":"IMAGE","link":15,"shape":7},{"name":"image_2","type":"IMAGE","link":14,"shape":7},{"name":"image_3","type":"IMAGE","link":null,"shape":7}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[6],"slot_index":0}],"properties":{"Node name for S&R":"ailab_OmniGen"},"widgets_values":["None","image_1 replace the girl with image_2 girl standing in light casual sweater holding a white coffee mug smilinng at the viewer","FP16","Speed Priority",3.5,1.04,50,true,false,1024,1024,181688286971999,"fixed",1024,[false,true]]},{"id":12,"type":"LoadImage","pos":[-390,-60],"size":[420,460],"flags":{},"order":0,"mode":0,"inputs":[],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[14],"slot_index":0},{"name":"MASK","type":"MASK","links":null}],"properties":{"Node name for S&R":"LoadImage"},"widgets_values":["Paint_43.png","image"]},{"id":8,"type":"LoadImage","pos":[-300,450],"size":[315,314],"flags":{},"order":1,"mode":0,"inputs":[],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[15],"slot_index":0},{"name":"MASK","type":"MASK","links":null}],"properties":{"Node name for S&R":"LoadImage"},"widgets_values":["Kitchen_girl.jpg","image"]},{"id":3,"type":"PreviewImage","pos":[470,0],"size":[760,750],"flags":{},"order":3,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":6}],"outputs":[],"properties":{"Node name for S&R":"PreviewImage"},"widgets_values":[]}],"links":[[6,6,0,3,0,"IMAGE"],[14,12,0,6,1,"IMAGE"],[15,8,0,6,0,"IMAGE"]],"groups":[],"config":{},"extra":{"ds":{"scale":0.814027493868404,"offset":[1340.0160605406977,547.049157473975]},"node_versions":{"ComfyUI-OmniGen":"1.2.2","comfy-core":"0.3.12"},"ue_links":[]},"version":0.4}

Additional Context

(Please add any additional context or steps to reproduce the error here)
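
One useful piece of additional context for this error is the installed transformers version, since the crash happens inside transformers' Phi-3 attention code and that code path has changed between releases. A quick way to capture it (a hedged example; run it with the same Python that ComfyUI uses, e.g. C:\ComfyUI\python_embeded\python.exe for the portable build above):

    # Print the library versions from the environment ComfyUI actually runs in
    import torch
    import transformers
    print("transformers:", transformers.__version__)
    print("torch:", torch.__version__)

Including these two versions alongside the traceback makes it much easier to tell whether the failure comes from a transformers upgrade or from the node itself.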

@lingtalfi

Same problem (using ComfyUI via Pinokio).

ComfyUI Error Report

Error Details

  • Node ID: 10
  • Node Type: ailab_OmniGen
  • Exception Type: TypeError
  • Exception Message: cannot unpack non-iterable NoneType object

Stack Trace

  File "D:\tools\ai\pinokio\api\comfy.git\app\execution.py", line 327, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)

  File "D:\tools\ai\pinokio\api\comfy.git\app\execution.py", line 202, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)

  File "D:\tools\ai\pinokio\api\comfy.git\app\execution.py", line 174, in _map_node_over_list
    process_inputs(input_dict, i)

  File "D:\tools\ai\pinokio\api\comfy.git\app\execution.py", line 163, in process_inputs
    results.append(getattr(obj, func)(**inputs))

  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\AILab_OmniGen.py", line 383, in generation
    raise e

  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\AILab_OmniGen.py", line 349, in generation
    output = pipe(

  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)

  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\OmniGen\pipeline.py", line 286, in __call__
    samples = scheduler(latents, func, model_kwargs, use_kv_cache=use_kv_cache, offload_kv_cache=offload_kv_cache)

  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\OmniGen\scheduler.py", line 164, in __call__
    pred, cache = func(z, timesteps, past_key_values=cache, **model_kwargs)

  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)

  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\OmniGen\model.py", line 388, in forward_with_separate_cfg
    temp_out, temp_pask_key_values = self.forward(x[i], timestep[i], input_ids[i], input_img_latents[i], input_image_sizes[i], attention_mask[i], position_ids[i], past_key_values=past_key_values[i], return_past_key_values=True, offload_model=offload_model)

  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\OmniGen\model.py", line 338, in forward
    output = self.llm(inputs_embeds=input_emb, attention_mask=attention_mask, position_ids=position_ids, past_key_values=past_key_values, offload_model=offload_model)

  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)

  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)

  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\OmniGen\transformer.py", line 157, in forward
    layer_outputs = decoder_layer(

  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)

  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)

  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 295, in forward
    hidden_states, self_attn_weights = self.self_attn(

  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)

  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)

  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 189, in forward
    cos, sin = position_embeddings

System Information

  • ComfyUI Version: 0.3.12
  • Arguments: main.py
  • OS: nt
  • Python Version: 3.10.16 | packaged by conda-forge | (main, Dec 5 2024, 14:07:43) [MSC v.1942 64 bit (AMD64)]
  • Embedded Python: false
  • PyTorch Version: 2.5.1+cu121

Devices

  • Name: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
    • Type: cuda
    • VRAM Total: 25756696576
    • VRAM Free: 16899991870
    • Torch VRAM Total: 8254390272
    • Torch VRAM Free: 1139894590

Logs

2025-01-28T00:46:34.684829 - [START] Security scan
2025-01-28T00:46:35.330884 - [DONE] Security scan
2025-01-28T00:46:35.378596 - ## ComfyUI-Manager: installing dependencies done.
2025-01-28T00:46:35.378596 - ** ComfyUI startup time: 2025-01-28 00:46:35.378
2025-01-28T00:46:35.378596 - ** Platform: Windows
2025-01-28T00:46:35.378596 - ** Python version: 3.10.16 | packaged by conda-forge | (main, Dec  5 2024, 14:07:43) [MSC v.1942 64 bit (AMD64)]
2025-01-28T00:46:35.379095 - ** Python executable: D:\tools\ai\pinokio\api\comfy.git\app\env\Scripts\python.exe
2025-01-28T00:46:35.379095 - ** ComfyUI Path: D:\tools\ai\pinokio\api\comfy.git\app
2025-01-28T00:46:35.379095 - ** ComfyUI Base Folder Path: D:\tools\ai\pinokio\api\comfy.git\app
2025-01-28T00:46:35.379095 - ** User directory: D:\tools\ai\pinokio\api\comfy.git\app\user
2025-01-28T00:46:35.379095 - ** ComfyUI-Manager config path: D:\tools\ai\pinokio\api\comfy.git\app\user\default\ComfyUI-Manager\config.ini
2025-01-28T00:46:35.380097 - ** Log path: D:\tools\ai\pinokio\api\comfy.git\app\user\comfyui.log
2025-01-28T00:46:36.315070 - 
Prestartup times for custom nodes:
2025-01-28T00:46:36.315070 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\rgthree-comfy
2025-01-28T00:46:36.315070 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-easy-use
2025-01-28T00:46:36.315070 -    1.9 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\ComfyUI-Manager
2025-01-28T00:46:36.315070 - 
2025-01-28T00:46:37.242769 - Checkpoint files will always be loaded safely.
2025-01-28T00:46:37.361979 - Total VRAM 24564 MB, total RAM 130780 MB
2025-01-28T00:46:37.361979 - pytorch version: 2.5.1+cu121
2025-01-28T00:46:37.362479 - Set vram state to: NORMAL_VRAM
2025-01-28T00:46:37.362979 - Device: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
2025-01-28T00:46:37.942580 - Using pytorch attention
2025-01-28T00:46:38.855211 - ComfyUI version: 0.3.12
2025-01-28T00:46:38.876238 - [Prompt Server] web root: D:\tools\ai\pinokio\api\comfy.git\app\web
2025-01-28T00:46:39.157198 - Adding2025-01-28T00:46:39.157198 -  2025-01-28T00:46:39.157198 - D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes2025-01-28T00:46:39.157198 -  2025-01-28T00:46:39.157198 - to sys.path2025-01-28T00:46:39.157698 - 
2025-01-28T00:46:39.248972 - Could not find efficiency nodes2025-01-28T00:46:39.249979 - 
2025-01-28T00:46:39.271515 - �[36;20m[comfyui_controlnet_aux] | INFO -> Using ckpts path: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui_controlnet_aux\ckpts�[0m
2025-01-28T00:46:39.272019 - �[36;20m[comfyui_controlnet_aux] | INFO -> Using symlinks: False�[0m
2025-01-28T00:46:39.272515 - �[36;20m[comfyui_controlnet_aux] | INFO -> Using ort providers: ['CUDAExecutionProvider', 'DirectMLExecutionProvider', 'OpenVINOExecutionProvider', 'ROCMExecutionProvider', 'CPUExecutionProvider', 'CoreMLExecutionProvider']�[0m
2025-01-28T00:46:39.521791 - DWPose: Onnxruntime with acceleration providers detected2025-01-28T00:46:39.522291 - 
2025-01-28T00:46:39.531298 - Loaded ControlNetPreprocessors nodes from2025-01-28T00:46:39.531298 -  2025-01-28T00:46:39.531298 - D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui_controlnet_aux2025-01-28T00:46:39.531298 - 
2025-01-28T00:46:39.531798 - Could not find AdvancedControlNet nodes2025-01-28T00:46:39.531798 - 
2025-01-28T00:46:39.532798 - Could not find AnimateDiff nodes2025-01-28T00:46:39.532798 - 
2025-01-28T00:46:39.532798 - Could not find IPAdapter nodes2025-01-28T00:46:39.533298 - 
2025-01-28T00:46:39.534798 - Could not find VideoHelperSuite nodes2025-01-28T00:46:39.535298 - 
2025-01-28T00:46:39.535801 - Could not load ImpactPack nodes2025-01-28T00:46:39.535801 -  2025-01-28T00:46:39.535801 - Could not find ImpactPack nodes2025-01-28T00:46:39.535801 - 
2025-01-28T00:46:39.868046 - [Crystools �[0;32mINFO�[0m] Crystools version: 1.21.0
2025-01-28T00:46:39.873606 - [Crystools �[0;32mINFO�[0m] CPU: Intel(R) Core(TM) i9-14900KS - Arch: AMD64 - OS: Windows 10
2025-01-28T00:46:39.879107 - [Crystools �[0;32mINFO�[0m] Pynvml (Nvidia) initialized.
2025-01-28T00:46:39.880108 - [Crystools �[0;32mINFO�[0m] GPU/s:
2025-01-28T00:46:39.885434 - [Crystools �[0;32mINFO�[0m] 0) NVIDIA GeForce RTX 4090
2025-01-28T00:46:39.885434 - [Crystools �[0;32mINFO�[0m] NVIDIA Driver: 551.86
2025-01-28T00:46:39.892973 - 
Depthcrafter Nodes Loaded
2025-01-28T00:46:39.893474 - 
2025-01-28T00:46:40.117061 - �[34m[ComfyUI-Easy-Use] server: �[0mv1.2.7 �[92mLoaded�[0m2025-01-28T00:46:40.117061 - 
2025-01-28T00:46:40.117061 - �[34m[ComfyUI-Easy-Use] web root: �[0mD:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-easy-use\web_version/v2 �[92mLoaded�[0m2025-01-28T00:46:40.117061 - 
2025-01-28T00:46:40.144755 - ### Loading: ComfyUI-Impact-Pack (V8.5.1)2025-01-28T00:46:40.144755 - 
2025-01-28T00:46:40.165798 - [Impact Pack] Wildcards loading done.2025-01-28T00:46:40.166299 - 
2025-01-28T00:46:40.600473 - Total VRAM 24564 MB, total RAM 130780 MB
2025-01-28T00:46:40.600473 - pytorch version: 2.5.1+cu121
2025-01-28T00:46:40.601473 - Set vram state to: NORMAL_VRAM
2025-01-28T00:46:40.601473 - Device: cuda:0 NVIDIA GeForce RTX 4090 : cudaMallocAsync
2025-01-28T00:46:40.624992 - ### Loading: ComfyUI-Manager (V3.9.5)
2025-01-28T00:46:40.684533 - ### ComfyUI Version: v0.3.12-26-g255edf22 | Released on '2025-01-27'
2025-01-28T00:46:40.924833 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
2025-01-28T00:46:40.933343 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
2025-01-28T00:46:40.950587 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json
2025-01-28T00:46:40.984137 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json
2025-01-28T00:46:41.033202 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
2025-01-28T00:46:41.213114 - ------------------------------------------2025-01-28T00:46:41.213114 - 
2025-01-28T00:46:41.213114 - �[34mComfyroll Studio v1.76 : �[92m 175 Nodes Loaded�[0m2025-01-28T00:46:41.213614 - 
2025-01-28T00:46:41.213614 - ------------------------------------------2025-01-28T00:46:41.213614 - 
2025-01-28T00:46:41.213614 - ** For changes, please see patch notes at https://github.com/Suzie1/ComfyUI_Comfyroll_CustomNodes/blob/main/Patch_Notes.md2025-01-28T00:46:41.213614 - 
2025-01-28T00:46:41.213614 - ** For help, please see the wiki at https://github.com/Suzie1/ComfyUI_Comfyroll_CustomNodes/wiki2025-01-28T00:46:41.213614 - 
2025-01-28T00:46:41.214114 - ------------------------------------------2025-01-28T00:46:41.214114 - 
2025-01-28T00:46:41.216614 - 
�[32mInitializing ControlAltAI Nodes�[0m2025-01-28T00:46:41.216614 - 
2025-01-28T00:46:41.553504 - D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\albumentations\__init__.py:13: UserWarning: A new version of Albumentations is available: 2.0.1 (you have 1.4.15). Upgrade using: pip install -U albumentations. To disable automatic update checks, set the environment variable NO_ALBUMENTATIONS_UPDATE to 1.
  check_for_updates()
2025-01-28T00:46:41.702845 - �[34mWAS Node Suite: �[0mImporting styles from `D:\tools\ai\pinokio\api\comfy.git\app\user\default\prompt-styles\sd-styles.csv`.�[0m2025-01-28T00:46:41.702845 - 
2025-01-28T00:46:41.705798 - �[34mWAS Node Suite: �[0mStyles import complete.�[0m2025-01-28T00:46:41.705798 - 
2025-01-28T00:46:42.103776 - �[34mWAS Node Suite: �[0mOpenCV Python FFMPEG support is enabled�[0m2025-01-28T00:46:42.103776 - 
2025-01-28T00:46:42.103776 - �[34mWAS Node Suite �[93mWarning: �[0m`ffmpeg_bin_path` is not set in `D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\pr-was-node-suite-comfyui-47064894\was_suite_config.json` config file. Will attempt to use system ffmpeg binaries if available.�[0m2025-01-28T00:46:42.104275 - 
2025-01-28T00:46:42.490908 - �[34mWAS Node Suite: �[0mFinished.�[0m �[32mLoaded�[0m �[0m220�[0m �[32mnodes successfully.�[0m2025-01-28T00:46:42.491408 - 
2025-01-28T00:46:42.491408 - 
	�[3m�[93m"You have within you right now, everything you need to deal with whatever the world can throw at you."�[0m�[3m - Brian Tracy�[0m
2025-01-28T00:46:42.491408 - 
2025-01-28T00:46:42.502918 - 
2025-01-28T00:46:42.502918 - �[92m[rgthree-comfy] Loaded 42 extraordinary nodes. 🎉�[00m2025-01-28T00:46:42.502918 - 
2025-01-28T00:46:42.502918 - 
2025-01-28T00:46:42.663997 - Traceback (most recent call last):
  File "D:\tools\ai\pinokio\api\comfy.git\app\nodes.py", line 2110, in load_custom_node
    module_spec.loader.exec_module(module)
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\stable-point-aware-3d\__init__.py", line 17, in <module>
    from spar3d.models.mesh import QUAD_REMESH_AVAILABLE, TRIANGLE_REMESH_AVAILABLE
  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\stable-point-aware-3d\spar3d\models\mesh.py", line 10, in <module>
    from jaxtyping import Float, Integer
ModuleNotFoundError: No module named 'jaxtyping'

2025-01-28T00:46:42.664999 - Cannot import D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\stable-point-aware-3d module for custom nodes: No module named 'jaxtyping'
2025-01-28T00:46:42.665497 - 
Import times for custom nodes:
2025-01-28T00:46:42.665497 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\canvas_tab
2025-01-28T00:46:42.665992 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\websocket_image_save.py
2025-01-28T00:46:42.665992 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-styles_csv_loader
2025-01-28T00:46:42.665992 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-seamless-tiling
2025-01-28T00:46:42.665992 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-inpaint-cropandstitch
2025-01-28T00:46:42.665992 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen
2025-01-28T00:46:42.665992 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\ComfyUI-GGUF
2025-01-28T00:46:42.665992 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui_controlnet_aux
2025-01-28T00:46:42.665992 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfy-image-saver
2025-01-28T00:46:42.666497 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui_essentials
2025-01-28T00:46:42.666497 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-depthcrafter-nodes
2025-01-28T00:46:42.666497 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui_ultimatesdupscale
2025-01-28T00:46:42.666497 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui_controlaltai_nodes
2025-01-28T00:46:42.666497 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-frame-interpolation
2025-01-28T00:46:42.666497 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-florence2
2025-01-28T00:46:42.666497 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-liveportraitkj
2025-01-28T00:46:42.666992 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\rgthree-comfy
2025-01-28T00:46:42.666992 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-hunyuanvideowrapper
2025-01-28T00:46:42.666992 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\ComfyUI_Comfyroll_CustomNodes
2025-01-28T00:46:42.666992 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-kjnodes
2025-01-28T00:46:42.666992 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-impact-pack
2025-01-28T00:46:42.666992 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\ComfyUI_Searge_LLM
2025-01-28T00:46:42.666992 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-crystools
2025-01-28T00:46:42.666992 -    0.0 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-videohelpersuite
2025-01-28T00:46:42.666992 -    0.2 seconds (IMPORT FAILED): D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\stable-point-aware-3d
2025-01-28T00:46:42.667497 -    0.2 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-easy-use
2025-01-28T00:46:42.667497 -    0.2 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\ComfyUI-Manager
2025-01-28T00:46:42.667497 -    0.3 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-cogvideoxwrapper
2025-01-28T00:46:42.667497 -    0.3 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-ollama
2025-01-28T00:46:42.667497 -    0.4 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui_instantid
2025-01-28T00:46:42.667497 -    0.4 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-itools
2025-01-28T00:46:42.667497 -    0.4 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-art-venture
2025-01-28T00:46:42.667992 -    0.9 seconds: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\pr-was-node-suite-comfyui-47064894
2025-01-28T00:46:42.667992 - 
2025-01-28T00:46:42.679015 - Starting server

2025-01-28T00:46:42.679015 - To see the GUI go to: http://127.0.0.1:8188
2025-01-28T00:46:47.150128 - FETCH ComfyRegistry Data: 5/31
2025-01-28T00:46:53.700429 - FETCH ComfyRegistry Data: 10/31
2025-01-28T00:47:00.378914 - FETCH ComfyRegistry Data: 15/31
2025-01-28T00:47:06.813505 - FETCH ComfyRegistry Data: 20/31
2025-01-28T00:47:14.028159 - FETCH ComfyRegistry Data: 25/31
2025-01-28T00:47:20.359157 - FETCH ComfyRegistry Data: 30/31
2025-01-28T00:47:22.252979 - FETCH ComfyRegistry Data [DONE]
2025-01-28T00:47:22.296933 - [ComfyUI-Manager] default cache updated: https://api.comfy.org/nodes
2025-01-28T00:47:22.311448 - nightly_channel: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/remote
2025-01-28T00:47:22.311956 - FETCH DATA from: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json [DONE]
2025-01-28T00:47:37.342627 - got prompt
2025-01-28T00:47:37.379273 - OmniGen code already exists
2025-01-28T00:47:37.380276 - OmniGen models verified successfully
2025-01-28T00:47:37.485383 - Auto selecting FP16 (Available VRAM: 24.0GB)
2025-01-28T00:47:37.486384 - Current VRAM usage: 0.00MB
2025-01-28T00:47:57.509243 - Loading safetensors
2025-01-28T00:48:02.336776 - Warning: Pipeline.to(device) returned None, using original pipeline
2025-01-28T00:48:02.338775 - VRAM usage after pipeline creation: 15102.28MB
2025-01-28T00:48:02.440538 - Processing with prompt: the woman from <img><|image_1|></img> is sitting in a armchair, cinematic photo, christmas mood, beautiful christmas photo, award winning photography
2025-01-28T00:48:02.441039 - Model will be kept during generation
2025-01-28T00:48:03.098355 - 
  0%|                                                                                                                            | 0/50 [00:00<?, ?it/s]
2025-01-28T00:48:03.180526 - Error during generation: cannot unpack non-iterable NoneType object
2025-01-28T00:48:03.220944 - !!! Exception during processing !!! cannot unpack non-iterable NoneType object
2025-01-28T00:48:03.222945 - Traceback (most recent call last):
  File "D:\tools\ai\pinokio\api\comfy.git\app\execution.py", line 327, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "D:\tools\ai\pinokio\api\comfy.git\app\execution.py", line 202, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "D:\tools\ai\pinokio\api\comfy.git\app\execution.py", line 174, in _map_node_over_list
    process_inputs(input_dict, i)
  File "D:\tools\ai\pinokio\api\comfy.git\app\execution.py", line 163, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\AILab_OmniGen.py", line 383, in generation
    raise e
  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\AILab_OmniGen.py", line 349, in generation
    output = pipe(
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\OmniGen\pipeline.py", line 286, in __call__
    samples = scheduler(latents, func, model_kwargs, use_kv_cache=use_kv_cache, offload_kv_cache=offload_kv_cache)
  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\OmniGen\scheduler.py", line 164, in __call__
    pred, cache = func(z, timesteps, past_key_values=cache, **model_kwargs)
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\OmniGen\model.py", line 388, in forward_with_separate_cfg
    temp_out, temp_pask_key_values = self.forward(x[i], timestep[i], input_ids[i], input_img_latents[i], input_image_sizes[i], attention_mask[i], position_ids[i], past_key_values=past_key_values[i], return_past_key_values=True, offload_model=offload_model)
  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\OmniGen\model.py", line 338, in forward
    output = self.llm(inputs_embeds=input_emb, attention_mask=attention_mask, position_ids=position_ids, past_key_values=past_key_values, offload_model=offload_model)
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\OmniGen\transformer.py", line 157, in forward
    layer_outputs = decoder_layer(
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 295, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 189, in forward
    cos, sin = position_embeddings
TypeError: cannot unpack non-iterable NoneType object

2025-01-28T00:48:03.224944 - Prompt executed in 25.88 seconds
2025-01-28T00:50:39.679335 - FETCH DATA from: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json [DONE]
2025-01-28T00:50:39.767101 - [ComfyUI-Manager] The ComfyRegistry cache update is still in progress, so an outdated cache is being used.
2025-01-28T00:50:39.784684 - nightly_channel: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/cache
2025-01-28T00:50:39.785179 - FETCH DATA from: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json [DONE]
2025-01-28T00:50:39.869990 - FETCH DATA from: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json [DONE]
2025-01-28T00:50:39.922546 - FETCH DATA from: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extras.json [DONE]
2025-01-28T00:52:34.628313 - FETCH DATA from: D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\ComfyUI-Manager\extension-node-map.json [DONE]
2025-01-28T00:52:35.080142 - Error. No styles.csv found. Put your styles.csv in the root directory of ComfyUI. Then press "Refresh".
                  Your current root directory is: D:\tools\ai\pinokio\api\comfy.git\app
2025-01-28T00:52:35.080142 - Error. No styles.csv found. Put your styles.csv in the root directory of ComfyUI. Then press "Refresh".
                  Your current root directory is: D:\tools\ai\pinokio\api\comfy.git\app
2025-01-28T00:52:56.453948 - got prompt
2025-01-28T00:52:56.457697 - OmniGen code already exists
2025-01-28T00:52:56.457853 - OmniGen models verified successfully
2025-01-28T00:52:56.458856 - Auto selecting FP16 (Available VRAM: 24.0GB)
2025-01-28T00:52:56.459870 - Current VRAM usage: 8.12MB
2025-01-28T00:53:16.369271 - Loading safetensors
2025-01-28T00:53:21.045372 - Warning: Pipeline.to(device) returned None, using original pipeline
2025-01-28T00:53:21.046872 - VRAM usage after pipeline creation: 15110.41MB
2025-01-28T00:53:21.046872 - Processing with prompt: Create an image of a 20-year-old woman looking directly at the viewer, with a neutral or friendly expression.
2025-01-28T00:53:21.046872 - Model will be kept during generation
2025-01-28T00:53:21.083375 - 
  0%|                                                                                                                            | 0/50 [00:00<?, ?it/s]
2025-01-28T00:53:21.085375 - Error during generation: cannot unpack non-iterable NoneType object
2025-01-28T00:53:21.121583 - !!! Exception during processing !!! cannot unpack non-iterable NoneType object
2025-01-28T00:53:21.124083 - Traceback (most recent call last):
  File "D:\tools\ai\pinokio\api\comfy.git\app\execution.py", line 327, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "D:\tools\ai\pinokio\api\comfy.git\app\execution.py", line 202, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
  File "D:\tools\ai\pinokio\api\comfy.git\app\execution.py", line 174, in _map_node_over_list
    process_inputs(input_dict, i)
  File "D:\tools\ai\pinokio\api\comfy.git\app\execution.py", line 163, in process_inputs
    results.append(getattr(obj, func)(**inputs))
  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\AILab_OmniGen.py", line 383, in generation
    raise e
  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\AILab_OmniGen.py", line 349, in generation
    output = pipe(
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\OmniGen\pipeline.py", line 286, in __call__
    samples = scheduler(latents, func, model_kwargs, use_kv_cache=use_kv_cache, offload_kv_cache=offload_kv_cache)
  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\OmniGen\scheduler.py", line 164, in __call__
    pred, cache = func(z, timesteps, past_key_values=cache, **model_kwargs)
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\OmniGen\model.py", line 388, in forward_with_separate_cfg
    temp_out, temp_pask_key_values = self.forward(x[i], timestep[i], input_ids[i], input_img_latents[i], input_image_sizes[i], attention_mask[i], position_ids[i], past_key_values=past_key_values[i], return_past_key_values=True, offload_model=offload_model)
  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\OmniGen\model.py", line 338, in forward
    output = self.llm(inputs_embeds=input_emb, attention_mask=attention_mask, position_ids=position_ids, past_key_values=past_key_values, offload_model=offload_model)
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\OmniGen\transformer.py", line 157, in forward
    layer_outputs = decoder_layer(
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 295, in forward
    hidden_states, self_attn_weights = self.self_attn(
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
  File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 189, in forward
    cos, sin = position_embeddings
TypeError: cannot unpack non-iterable NoneType object

2025-01-28T00:53:21.126583 - Prompt executed in 24.67 seconds

Attached Workflow

Please make sure that workflow does not contain any sensitive information such as API keys or passwords.

{"last_node_id":11,"last_link_id":13,"nodes":[{"id":3,"type":"PreviewImage","pos":[450,60],"size":[553.608642578125,826.3672485351562],"flags":{},"order":1,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":11,"label":"images"}],"outputs":[],"properties":{"Node name for S&R":"PreviewImage"},"widgets_values":[]},{"id":9,"type":"PreviewImage","pos":[1430,60],"size":[516.1412353515625,824.7374267578125],"flags":{},"order":3,"mode":0,"inputs":[{"name":"images","type":"IMAGE","link":12,"label":"images"}],"outputs":[],"properties":{"Node name for S&R":"PreviewImage"},"widgets_values":[]},{"id":11,"type":"ailab_OmniGen","pos":[1013,60],"size":[400,428],"flags":{},"order":2,"mode":0,"inputs":[{"name":"image_1","type":"IMAGE","link":13,"shape":7,"label":"image_1"},{"name":"image_2","type":"IMAGE","link":null,"shape":7,"label":"image_2"},{"name":"image_3","type":"IMAGE","link":null,"shape":7,"label":"image_3"}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[12],"slot_index":0,"label":"IMAGE"}],"properties":{"Node name for S&R":"ailab_OmniGen"},"widgets_values":["The girl in image_1 sitting on rock on top of the mountain (image_1)","","Auto","Balanced",3.5,1.8,50,true,false,512,512,294120280309690,"randomize",1024]},{"id":10,"type":"ailab_OmniGen","pos":[33,60],"size":[400,428],"flags":{},"order":0,"mode":0,"inputs":[{"name":"image_1","type":"IMAGE","link":null,"shape":7,"label":"image_1"},{"name":"image_2","type":"IMAGE","link":null,"shape":7,"label":"image_2"},{"name":"image_3","type":"IMAGE","link":null,"shape":7,"label":"image_3"}],"outputs":[{"name":"IMAGE","type":"IMAGE","links":[11,13],"slot_index":0,"label":"IMAGE"}],"properties":{"Node name for S&R":"ailab_OmniGen"},"widgets_values":["20yo woman looking at viewer","","Auto","Balanced",3.5,1.8,50,true,false,512,512,70335611382804,"randomize",1024]}],"links":[[11,10,0,3,0,"IMAGE"],[12,11,0,9,0,"IMAGE"],[13,10,0,11,0,"IMAGE"]],"groups":[],"config":{},"extra":{"ds":{"scale":0.6115909044841626,"offset":[209.10111702369386,522.5405398012589]},"node_versions":{"comfy-core":"0.3.12","comfyui-omnigen":"121073508ff04773c57bbdedabf51ff0062190ba"},"VHS_latentpreview":false,"VHS_latentpreviewrate":0},"version":0.4}

Additional Context

(Please add any additional context or steps to reproduce the error here)

@majorchen

Still having the same issue. Has anyone fixed or solved it? Thanks a lot @1038lab

@lingtalfi

Side note: as far as testing OmniGen locally goes, I was able to do that using Pinokio: it has a one-click OmniGen install package.
I know it's not directly related to the issue, but this is just for those who want to test OmniGen locally...

@oldskool978

Still having the same issue. Has anyone fixed or solved it? Thanks a lot @1038lab

try pip install transformers==4.45.2

@zoxuandinhzo

Still having the same issue. Has anyone fixed or solved it? Thanks a lot @1038lab

try pip install transformers==4.45.2

thank you! it works ^^

@vivi-gomez

Still having the same issue. Has anyone fixed or solved it? Thanks a lot @1038lab

try pip install transformers==4.45.2

It works now. I could not make it run with transformers-4.48.3

@allover326

Same issue, even with Transformers 4.45.2

@bongobongo2020

Works with Transformers 4.45.2 - used on a ComfyUI install on StabilityMatrix, which makes it super easy to change transformers versions.

@pptx-each

Same issue, even with Transformers 4.45.2

Can you run it normally?

@rahul-akumar

I can't run this with any version of transformers. How can I get more help?

[Image attachment]

@MC-kakadu

Working well with 4.45.2 : )

@RishiAbohariya

Yes, Working well with 4.45.2 👍

@ericleigh007

If that's the case, then we need to change setup.py to pin that exact version, since it currently says >=4.45.2.
The requirements.txt file is good though.

For those who don't know, the setup.py file is what gets used when you run
pip install -e .

Actually, the original project seems to make a couple of bad assumptions, for instance not calling out the CUDA versions of the torch* packages.
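
As a rough illustration only (the project's actual setup.py layout may differ, and the metadata below is placeholder), pinning the exact version would look something like this:

from setuptools import setup, find_packages

setup(
    name="OmniGen",                        # placeholder; keep the project's real metadata
    packages=find_packages(),
    install_requires=[
        "transformers==4.45.2",            # exact pin instead of "transformers>=4.45.2"
        # ...leave the project's other dependencies unchanged
    ],
)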

@Samarth2242

Still having the same issue. Has anyone fixed or solved it? Thanks a lot @1038lab

try pip install transformers==4.45.2

Thank you, it worked.

@0DdoomUs0

Still having the same issue. Has anyone fixed or solved it? Thanks a lot @1038lab

try pip install transformers==4.45.2

Worked for me as well (not using ComfyUI actually, hit this on vanilla OmniGen and happened to find this post)

@reyisok

reyisok commented Mar 18, 2025

Still having the same issue. Has anyone fixed or solved it? Thanks a lot @1038lab

try pip install transformers==4.45.2

The requirements file requires transformers>=4.30.0; still issues with version 4.49.0.

@DryIceX

DryIceX commented Mar 20, 2025

Issues here too: cannot unpack non-iterable NoneType object

Did pip install.

@DryIceX

DryIceX commented Mar 20, 2025

Should I uninstall and then reinstall transformers? Or is there a different node that works like Omnigen?

@DryIceX

DryIceX commented Mar 20, 2025

Welp, gave it my best effort. Can't get this to work in ComfyUI, so I'm using the standalone version. If anyone creates a workaround or a different repository node, please link it to me. Much appreciated.

@Roongx

Roongx commented Mar 21, 2025

Sorry, but can anyone share how to go about using "pip install transformers==4.45.2"?
Thank you

@bozkut

bozkut commented Mar 21, 2025

Yes please, I need help with it too.

Sorry, but can anyone share how to go about using "pip install transformers==4.45.2"? Thank you

@bozkut

bozkut commented Mar 21, 2025

Issues here too: cannot unpack non-iterable NoneType object

Did pip install.

Can you tell me where to run this command?

@DryIceX

DryIceX commented Mar 21, 2025

Here is what Gemini 2.0 Flash experimental said to me... and it worked on my corp computer. Will try at Home Office later.

D:\ai\ComfyUI_windows_portable_nightly_pytorch\python_embeded\python.exe: This is the path to the Python interpreter that ComfyUI uses. It's crucial to use this specific Python to ensure the library is installed in the correct environment for ComfyUI to find it.
-m pip: This tells Python to run the pip module (the package installer).
install transformers==4.45.2: This is the command for pip to install the transformers library with the exact version specified (4.45.2).

Command being: python.exe -m pip install transformers==4.45.2

This needs to be done in the python_embeded folder (type cmd in the Explorer address bar and hit Enter to open a command prompt there).
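
To double-check that the downgrade landed in the environment ComfyUI actually uses, a tiny script like the one below works; the interpreter path is just the example path from above, and check_transformers.py is a made-up file name:

# check_transformers.py - run it with the embedded interpreter, for example:
#   D:\ai\ComfyUI_windows_portable_nightly_pytorch\python_embeded\python.exe check_transformers.py
import transformers

print("transformers version:", transformers.__version__)
# Expect 4.45.2 after the downgrade; a newer number means pip installed into a
# different Python environment than the one ComfyUI is running.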

@weizxcz

weizxcz commented Mar 22, 2025

I can't run this with transformers==4.45.2. How can I get more help?

@LitaoGuo

Same problem, and transformers==4.45.2 did not work either.

@cneva

cneva commented Mar 25, 2025

Same issue, even with Transformers 4.45.2...

Help!

Tried on Pinokio / ComfyUI and Windows ComfyUI.
Pinokio / OmniGen (standalone) is working.

@Eikwang

Eikwang commented Mar 26, 2025

I can't run this with transformers==4.45.2. How can I get more help?

Try transformers==4.36.2

@mykeehu

mykeehu commented Mar 28, 2025

Same problem, but I have version 4.50.2.

@Pkhtjim

Pkhtjim commented Mar 30, 2025

Tried 4.50 and 4.45.2; changing the transformers version made no difference.

@rzw520

rzw520 commented Apr 2, 2025

pip install transformers==4.45.2 worked very well for me and solved my problem; version 4.49.0 was automatically uninstalled during installation.

@mykeehu

mykeehu commented Apr 2, 2025

However, ComfyUI is already using 4.50.2, so it would be good to update the code.

@theRealAi

pip install transformers==4.45.2 didn't solve anything (neither did 4.49 / 4.50).

@sruckh-kubra

sruckh-kubra commented Apr 3, 2025

Same issue. Not sure I want to downgrade my transformers. The version I currently have installed is 4.50.3.

Error during generation: cannot unpack non-iterable NoneType object
!!! Exception during processing !!! cannot unpack non-iterable NoneType object
Traceback (most recent call last):
File "/workspace/ComfyUI/execution.py", line 327, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/ComfyUI/execution.py", line 202, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/ComfyUI/execution.py", line 174, in _map_node_over_list
process_inputs(input_dict, i)
File "/workspace/ComfyUI/execution.py", line 163, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/ComfyUI/custom_nodes/ComfyUI-OmniGen/AILab_OmniGen.py", line 383, in generation
raise e
File "/workspace/ComfyUI/custom_nodes/ComfyUI-OmniGen/AILab_OmniGen.py", line 349, in generation
output = pipe(
^^^^^
File "/workspace/miniconda3/envs/comfy/lib/python3.12/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/workspace/ComfyUI/custom_nodes/ComfyUI-OmniGen/OmniGen/pipeline.py", line 286, in call
samples = scheduler(latents, func, model_kwargs, use_kv_cache=use_kv_cache, offload_kv_cache=offload_kv_cache)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/ComfyUI/custom_nodes/ComfyUI-OmniGen/OmniGen/scheduler.py", line 164, in call
pred, cache = func(z, timesteps, past_key_values=cache, **model_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/miniconda3/envs/comfy/lib/python3.12/site-packages/torch/utils/_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/workspace/ComfyUI/custom_nodes/ComfyUI-OmniGen/OmniGen/model.py", line 388, in forward_with_separate_cfg
temp_out, temp_pask_key_values = self.forward(x[i], timestep[i], input_ids[i], input_img_latents[i], input_image_sizes[i], attention_mask[i], position_ids[i], past_key_values=past_key_values[i], return_past_key_values=True, offload_model=offload_model)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/ComfyUI/custom_nodes/ComfyUI-OmniGen/OmniGen/model.py", line 338, in forward
output = self.llm(inputs_embeds=input_emb, attention_mask=attention_mask, position_ids=position_ids, past_key_values=past_key_values, offload_model=offload_model)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/miniconda3/envs/comfy/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/miniconda3/envs/comfy/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/ComfyUI/custom_nodes/ComfyUI-OmniGen/OmniGen/transformer.py", line 157, in forward
layer_outputs = decoder_layer(
^^^^^^^^^^^^^^
File "/workspace/miniconda3/envs/comfy/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/miniconda3/envs/comfy/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/miniconda3/envs/comfy/lib/python3.12/site-packages/transformers/models/phi3/modeling_phi3.py", line 301, in forward
hidden_states, self_attn_weights = self.self_attn(
^^^^^^^^^^^^^^^
File "/workspace/miniconda3/envs/comfy/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1739, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/miniconda3/envs/comfy/lib/python3.12/site-packages/torch/nn/modules/module.py", line 1750, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspace/miniconda3/envs/comfy/lib/python3.12/site-packages/transformers/models/phi3/modeling_phi3.py", line 195, in forward
cos, sin = position_embeddings
^^^^^^^^
TypeError: cannot unpack non-iterable NoneType object

@Onverra-sudo

Onverra-sudo commented Apr 14, 2025

Hello

Maybe a workaround:
The problem with "transformers" is caused by a big refactor of the phi3 model.
huggingface/transformers@2c47618

You can reuse the old phi3 model from the commit just before the transformers refactor:
https://github.com/huggingface/transformers/tree/5779bac4c45b2c881603cafd20663892869d5860/src/transformers/models/phi3
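
To see what the error literally is: after that refactor, the phi3 attention receives a precomputed (cos, sin) tuple through position_embeddings, and the code path used by this node ends up passing None, so the unpack at the top of the attention forward fails. A two-line illustration of just the failing pattern (not project code):

position_embeddings = None          # what effectively reaches the new phi3 attention here
cos, sin = position_embeddings      # TypeError: cannot unpack non-iterable NoneType object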

I put the old phi3 model into a current transformers installation (transformers 4.51.3) under a new name, "phi3Old", in the folder "ComfyUI\venv\Lib\site-packages\transformers\models".

I renamed the file "ComfyUI\venv\Lib\site-packages\transformers\models\phi3Old\configuration_phi3.py" to "ComfyUI\venv\Lib\site-packages\transformers\models\phi3Old\configuration_phi3Old.py".

I renamed the file "ComfyUI\venv\Lib\site-packages\transformers\models\phi3Old\modeling_phi3.py" to "ComfyUI\venv\Lib\site-packages\transformers\models\phi3Old\modeling_phi3Old.py".

in "ComfyUI\venv\Lib\site-packages\transformers\models\phi3Old_init_.py" i add "Old"
Line 29: "configuration_phi3Old": ["Phi3ConfigOld"],
Line 29: "configuration_phi3Old": ["Phi3ConfigOld"],
Line 38: _import_structure["modeling_phi3Old"] = [
Line 40: "Phi3ModelOld",
Line 48: from .configuration_phi3Old import Phi3ConfigOld
Line 48: from .configuration_phi3Old import Phi3ConfigOld
Line 56: from .modeling_phi3Old import (
Line 60: Phi3ModelOld,

in "ComfyUI\venv\Lib\site-packages\transformers\models\phi3Old\configuration_phi3Old.py" i add "Old" in following line
Line 25: class Phi3ConfigOld(PretrainedConfig):

in "ComfyUI\venv\Lib\site-packages\transformers\models\phi3Old\modeling_phi3Old.py" i add "Old" in following line
Line 47: from .configuration_phi3Old import Phi3ConfigOld
Line 47: from .configuration_phi3Old import Phi3ConfigOld
Line 53: _CONFIG_FOR_DOC = "Phi3ConfigOld"
Line 305: def init(self, config: Phi3ConfigOld, layer_idx: Optional[int] = None):
Line 655: def init(self, config: Phi3ConfigOld, layer_idx: int):
Line 745: config ([`Phi3ConfigOld`]):
Line 757: config_class = Phi3ConfigOld
Line 859: class Phi3ModelOld(Phi3PreTrainedModel):
Line 864: config: Phi3ConfigOld
Line 867: def init(self, config: Phi3ConfigOld):
Line 1096: config: Phi3ConfigOld,
Line 1118: config (`Phi3ConfigOld`):
Line 1161: self.model = Phi3ModelOld(config)
Line 1350: self.model = Phi3ModelOld(config)
Line 1444: def init(self, config: Phi3ConfigOld):
Line 1448: self.model = Phi3ModelOld(config)

in "ComfyUI\venv\Lib\site-packages\transformers\models_init_" i I add "phi3Old"
Line 216: phi,
Line 217: phi3,
Line 218: phi3Old,
Line 219: phi4_multimodal,
Line 220: phimoe,

in "ComfyUI\venv\Lib\site-packages\transformers_init_" i I add all reference needed on the new folder "phi3phi3Old"
Line 710: "models.phi3Old": ["Phi3ConfigOld"],
Line 3361: _import_structure["models.phi3Old"].extend(
Line 3362: [
Line 3363: "Phi3ModelOld",
Line 3364: ]
Line 3365: )
Line 6010: from .models.phi3Old import Phi3ConfigOld
Line 8280: from .models.phi3Old import (
Line 8281: Phi3ModelOld,
Line 8282: )

And of course, in the ailab_OmniGen file "ComfyUI\custom_nodes\OmniGen-ComfyUI\OmniGen\transformer.py" I added "Old" in the import and the class definition:
Line 18: from transformers import Phi3ConfigOld, Phi3ModelOld
Line 27: class Phi3Transformer(Phi3ModelOld):

I don't know why the new phi3 model doesn't work while the old one does; I am not a specialist.
This solution is just a workaround. It's a very ugly solution, but it works.
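
For anyone who would rather patch the custom node than copy an old phi3 into transformers, here is a rough, untested sketch of the idea. It assumes the installed transformers build exposes the model-level rotary embedding as self.rotary_emb (recent releases do, and Phi3Transformer subclasses Phi3Model, so it would inherit it) and that the decoder-layer loop in OmniGen/transformer.py looks roughly like the stock Phi3Model loop; the argument names in that copy may differ:

# Hypothetical patch inside Phi3Transformer.forward in OmniGen/transformer.py,
# just before the decoder-layer loop (around line 157 in the tracebacks above).
position_embeddings = None
if getattr(self, "rotary_emb", None) is not None:
    # Newer transformers compute the rotary (cos, sin) once per forward pass
    # and hand it to every decoder layer.
    position_embeddings = self.rotary_emb(hidden_states, position_ids)

for decoder_layer in self.layers:
    layer_outputs = decoder_layer(
        hidden_states,
        attention_mask=attention_mask,
        position_ids=position_ids,
        past_key_value=past_key_values,
        output_attentions=output_attentions,
        use_cache=use_cache,
        position_embeddings=position_embeddings,  # the tuple the new phi3 attention unpacks
    )
    hidden_states = layer_outputs[0]

With the tuple passed through, the "cos, sin = position_embeddings" line in modeling_phi3.py has something to unpack, which is what pinning transformers to 4.45.2 avoided needing in the first place.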
