TypeError: cannot unpack non-iterable NoneType object #37
I was able to make OmniGen run in a virtual env as Shatoi said in his post, but I never got it working with ComfyUI.
Same problem!
Updated; this should fix your issue.
Got the same error after Python updates (full ComfyUI error report attached: error details, stack trace, system information, devices, logs, workflow).
Same problem (using ComfyUI via Pinokio; full ComfyUI error report attached).
Still have the same issue. Has anyone fixed or solved it? Thanks a lot @1038lab
Side note: as far as testing OmniGen locally goes, I was able to do that using Pinokio: it has a one-click OmniGen install package.
Try pip install transformers==4.45.2
Thank you! It works ^^
It works now. I could not make it run with transformers 4.48.3.
Same issue, even with transformers 4.45.2.
Works with transformers 4.45.2. Used on a ComfyUI install on StabilityMatrix, which makes it super easy to change transformers versions.
Can you run it normally?
Working well with 4.45.2 : )
Yes, working well with 4.45.2 👍
If that's the case, then we need to change setup.py to pin that exact version; it currently says >=4.45.2. For those who don't know, the setup.py file is used when you install the package. Actually, the original project seems to make a couple of bad assumptions, for instance not calling out the CUDA versions of the torch* packages.
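For illustration, a sketch of what that exact pin could look like; this assumes a standard setuptools layout, and the project's real setup.py will list other metadata and dependencies too:

```python
# Sketch only: pinning the version reported working in this thread.
# The package name and other fields here are placeholders.
from setuptools import find_packages, setup

setup(
    name="omnigen",
    packages=find_packages(),
    install_requires=[
        "transformers==4.45.2",  # exact pin instead of the current >=4.45.2
        # CUDA builds of torch are usually installed separately with the
        # appropriate --index-url rather than listed here.
    ],
)
```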
Thank you, it worked.
Worked for me as well (not using ComfyUI, actually; I hit this on vanilla OmniGen and happened to find this post).
The requirements file requires transformers>=4.30.0; there are still issues with version 4.49.0.
Issues here too: cannot unpack non-iterable NoneType object. Did pip install.
Should I uninstall and then reinstall transformers? Or is there a different node that works like OmniGen?
Welp, gave it my best effort. Can't get this to work in ComfyUI, so I'm using the standalone version. If anyone creates a workaround or a different repository node, please link it to me. Much appreciated.
Sorry, but can anyone share how to go about using "pip install transformers==4.45.2"?
Yes please, I need help with it.
Can you tell me where to run this command?
Here is what Gemini 2.0 Flash Experimental told me, and it worked on my corporate computer (will try at the home office later). D:\ai\ComfyUI_windows_portable_nightly_pytorch\python_embeded\python.exe is the path to the Python interpreter that ComfyUI uses. It's crucial to use this specific Python so the library is installed in the environment where ComfyUI will find it. The command is: python.exe -m pip install transformers==4.45.2. Run it from inside the python_embeded folder (type cmd in the Explorer address bar and hit Enter to open a command prompt there).
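Putting the steps above together, the commands look roughly like this; the install path is the one from that comment, so substitute your own ComfyUI portable folder:

```shell
:: Run from a Windows command prompt opened inside the python_embeded folder
:: (the path below is this commenter's install; yours will differ).
cd /d D:\ai\ComfyUI_windows_portable_nightly_pytorch\python_embeded

:: Install the version reported working in this thread into ComfyUI's own Python
python.exe -m pip install transformers==4.45.2

:: Confirm which version ComfyUI will now load
python.exe -c "import transformers; print(transformers.__version__)"
```

After installing, restart ComfyUI so the new version is picked up.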
I can't run this with transformers==4.45.2. How can I get more help?
Same problem, and transformers==4.45.2 did not work either.
Same issue, even with transformers 4.45.2... Help! Tried on Pinokio / ComfyUI and Windows ComfyUI.
Try transformers==4.36.2
Same problem, but I have 4.50.2 version. |
Tried 4.50 and 4.45.2; no change from the transformers version.
pip install transformers==4.45.2 — this version works; it solved my problem. During the install, version 4.49.0 is automatically uninstalled.
However, ComfyUI is already using 4.50.2, so it would be good to update the code. |
pip install transformers==4.45.2 didn't solve anything (nor did 4.49 / 4.50).
Same issue. Not sure I want to downgrade my transformers. The version I currently have installed is 4.50.3. Error during generation: cannot unpack non-iterable NoneType object |
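Before deciding whether to downgrade, it can help to check which version ComfyUI's Python actually has. A small sketch, with the 4.45.2 cutoff taken from the reports in this thread (4.45.2 works; 4.48.3, 4.49, and 4.50.x fail) rather than from any official changelog:

```python
# Check the installed transformers version against what this thread reports.
from importlib.metadata import PackageNotFoundError, version


def parse_version(v: str) -> tuple:
    """Turn '4.45.2' into (4, 45, 2) for comparison."""
    return tuple(int(part) for part in v.split(".")[:3])


def likely_affected(v: str) -> bool:
    """True if this version is newer than the last one reported working here."""
    return parse_version(v) > parse_version("4.45.2")


def check_transformers() -> None:
    try:
        v = version("transformers")
    except PackageNotFoundError:
        print("transformers is not installed in this environment")
        return
    if likely_affected(v):
        print(f"transformers {v} may hit the NoneType unpack error; "
              "this thread reports 4.45.2 working")
    else:
        print(f"transformers {v} should be fine, per this thread")
```

Run check_transformers() with ComfyUI's embedded Python to test the environment ComfyUI actually uses, not your system Python.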
Hello. Maybe a workaround: you can reuse the old phi3 model from the previous merge request, before the transformers refactor. I put the old phi3 model into a current transformers installation (transformers 4.51.3) under a new name, "phi3Old", in the folder ComfyUI\venv\Lib\site-packages\transformers\models:
- I renamed ComfyUI\venv\Lib\site-packages\transformers\models\phi3Old\configuration_phi3.py to configuration_phi3Old.py
- I renamed ComfyUI\venv\Lib\site-packages\transformers\models\phi3Old\modeling_phi3.py to modeling_phi3Old.py
- In ComfyUI\venv\Lib\site-packages\transformers\models\phi3Old\__init__.py I added "Old"
- In ComfyUI\venv\Lib\site-packages\transformers\models\phi3Old\configuration_phi3Old.py I added "Old" in the relevant line
- In ComfyUI\venv\Lib\site-packages\transformers\models\phi3Old\modeling_phi3Old.py I added "Old" in the relevant line
- In ComfyUI\venv\Lib\site-packages\transformers\models\__init__.py I added "phi3Old"
- In ComfyUI\venv\Lib\site-packages\transformers\__init__.py I added all the references needed for the new folder "phi3Old"
- And of course, in the ailab_OmniGen file ComfyUI\custom_nodes\OmniGen-ComfyUI\OmniGen\transformer.py, I added "Old" to the import and the function call
I don't know why the new phi3 model doesn't work while the old phi3 model does; I am not a specialist.
I just reinstalled my ComfyUI, trying to use the OmniGen custom node, but it fails. I'm a newbie in Python code. Help please, I need to fix it as soon as possible.
ailab_OmniGen
cannot unpack non-iterable NoneType object
got prompt
<|image_1|> into an oil painting, giving it a textured, classic style with visible brushstrokes and rich color.
Downloading OmniGen code from GitHub...
Downloaded model.py
Downloaded pipeline.py
Downloaded processor.py
Downloaded scheduler.py
Downloaded transformer.py
Downloaded utils.py
Downloaded __init__.py
OmniGen code setup completed
OmniGen models verified successfully
Current model instance: None
Current model precision: None
Loading safetensors
Warning: Pipeline.to(device) returned None, using original pipeline
VRAM usage after pipeline creation: 15102.28MB
Processing with prompt: Transform
Model will be kept during generation
0%| | 0/25 [00:00<?, ?it/s]
Error during generation: cannot unpack non-iterable NoneType object
!!! Exception during processing !!! cannot unpack non-iterable NoneType object
Traceback (most recent call last):
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\execution.py", line 327, in execute
output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\execution.py", line 202, in get_output_data
return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\execution.py", line 174, in _map_node_over_list
process_inputs(input_dict, i)
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\execution.py", line 163, in process_inputs
results.append(getattr(obj, func)(**inputs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-OmniGen\ailab_OmniGen.py", line 387, in generation
raise e
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-OmniGen\ailab_OmniGen.py", line 353, in generation
output = pipe(
^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\pipeline.py", line 286, in __call__
samples = scheduler(latents, func, model_kwargs, use_kv_cache=use_kv_cache, offload_kv_cache=offload_kv_cache)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\scheduler.py", line 164, in __call__
pred, cache = func(z, timesteps, past_key_values=cache, **model_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\utils\_contextlib.py", line 116, in decorate_context
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\model.py", line 388, in forward_with_separate_cfg
temp_out, temp_pask_key_values = self.forward(x[i], timestep[i], input_ids[i], input_img_latents[i], input_image_sizes[i], attention_mask[i], position_ids[i], past_key_values=past_key_values[i], return_past_key_values=True, offload_model=offload_model)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\model.py", line 338, in forward
output = self.llm(inputs_embeds=input_emb, attention_mask=attention_mask, position_ids=position_ids, past_key_values=past_key_values, offload_model=offload_model)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-OmniGen\OmniGen\transformer.py", line 157, in forward
layer_outputs = decoder_layer(
^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 295, in forward
hidden_states, self_attn_weights = self.self_attn(
^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\ComfyUiPortable\ComfyUI_windows_portable\python_embeded\Lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 189, in forward
cos, sin = position_embeddings
^^^^^^^^
TypeError: cannot unpack non-iterable NoneType object
Prompt executed in 304.61 seconds
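The final frame of the traceback is the whole story: in newer transformers releases, Phi3's attention expects the caller to pass position_embeddings as a (cos, sin) tuple, while OmniGen's vendored model code predates that refactor and passes nothing, so the parameter stays None. A minimal sketch of the failure mode, where attention_forward is a stand-in for the real Phi3Attention.forward, not the actual API:

```python
# Minimal reproduction of the error seen in the traceback above.
def attention_forward(hidden_states, position_embeddings=None):
    # Mirrors the failing line in modeling_phi3.py: unpacking None raises.
    cos, sin = position_embeddings  # TypeError when position_embeddings is None
    return cos, sin


try:
    # Old-style caller: no position_embeddings supplied.
    attention_forward(hidden_states=[0.0])
except TypeError as e:
    print(f"TypeError: {e}")  # cannot unpack non-iterable NoneType object

# A new-style caller supplies the (cos, sin) tuple and the unpack succeeds.
cos, sin = attention_forward(hidden_states=[0.0], position_embeddings=(1.0, 0.0))
```

This is why pinning transformers==4.45.2 (from before the refactor) resolves the error for most people in this thread, and why updating the vendored OmniGen code to pass rotary embeddings would be the forward-compatible fix.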