Maybe a bug in the phi3 model after the refactor, or not? #37912
Comments
@Onverra-sudo hey! After that refactor, all models prepare the RoPE cos/sin in the base model and pass them down to each decoder layer; see transformers/src/transformers/models/phi3/modeling_phi3.py, lines 579 to 581 at commit 7a3e208.
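In sketch form, the (cos, sin) pair the base model now builds once per forward pass looks roughly like this (a toy stand-in for illustration, not the real `Phi3RotaryEmbedding` implementation):

```python
import torch

def toy_rope(position_ids, head_dim):
    """Toy stand-in for the rotary embedding the base model now computes
    once per forward pass (the real code is Phi3RotaryEmbedding in
    modeling_phi3.py). Returns the (cos, sin) pair that every decoder
    layer receives as `position_embeddings`."""
    # One inverse frequency per pair of channels (the standard RoPE recipe).
    inv_freq = 1.0 / (10000.0 ** (torch.arange(0, head_dim, 2).float() / head_dim))
    angles = position_ids.float()[..., None] * inv_freq  # [batch, seq, head_dim/2]
    emb = torch.cat((angles, angles), dim=-1)            # [batch, seq, head_dim]
    return emb.cos(), emb.sin()
```

The key point is that this tuple is created once in the base model and then threaded through every decoder layer, instead of each attention module recomputing it.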
Thanks, but I think it's too complicated for me (models are a new domain for me). After some research, the problem is in the OmniGen model, not in the ComfyUI node: https://github.com/VectorSpaceLab/OmniGen/tree/main/OmniGen. All the ComfyUI nodes reuse the source code from that model repo.
@Onverra-sudo not sure if you're the maintainer of the OmniGen repo; if not, feel free to open an issue on OmniGen. The necessary fix is in those lines: OmniGen's forward loop has to follow the same pattern as the refactored Phi3 model.
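A minimal sketch of that caller-side pattern, using toy stand-ins (the names `toy_rotary_emb`, `toy_decoder_layer`, and `fixed_forward` are illustrative, not OmniGen's or transformers' actual code):

```python
import torch

def toy_rotary_emb(hidden_states, position_ids):
    """Toy stand-in for the model's rotary embedding: returns a (cos, sin)
    tuple broadcast to the hidden states' last dimension."""
    angles = position_ids.float()[..., None]
    angles = angles.expand(*position_ids.shape, hidden_states.shape[-1])
    return torch.cos(angles), torch.sin(angles)

def toy_decoder_layer(hidden_states, position_embeddings=None):
    """Mimics the new decoder-layer contract: it unpacks the tuple, so a
    call that omits `position_embeddings` reproduces the TypeError from
    the traceback below."""
    cos, sin = position_embeddings
    return hidden_states * cos + hidden_states * sin  # placeholder mixing

def fixed_forward(layers, hidden_states, position_ids):
    # The fix for a caller like OmniGen's transformer.py: build the tuple
    # once in the outer forward loop and hand it to every layer.
    position_embeddings = toy_rotary_emb(hidden_states, position_ids)
    for layer in layers:
        hidden_states = layer(hidden_states, position_embeddings=position_embeddings)
    return hidden_states
```

A caller that keeps the old loop shape but never computes `position_embeddings` is exactly what breaks after the refactor.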
Yes, I am not the OmniGen maintainer, and I opened an issue on OmniGen (VectorSpaceLab/OmniGen#212). I'll wait for a patch from the OmniGen team, and during this period I will use my workaround based on the previous phi3 model. Thanks for everything.
Great, closing as resolved :)
Just one last thing: I'm sharing my manual workaround in the attached zip (transformers_patch_phi3old.zip). After merging it with transformers 4.51.3, it's possible to reuse the old phi3 model via `from transformers import Phi3Config, Phi3Model`, which should unblock some people.
System Info
For your information: after the refactor in commit 2c47618, the OmniGen node for ComfyUI is broken.
https://github.com/set-soft/ComfyUI_OmniGen_Nodes
I manually patched transformers to restore the old phi3 model (transformers_patch_phi3old.zip), but that is not a good solution.
An example of the bug with the new phi3 model after the refactor, in the forward function: 1038lab/ComfyUI-OmniGen#37 (comment).
Can you explain how I can update the OmniGen module to work with the new Phi3 model after the transformers refactor?
Who can help?
No response
Information
Tasks
Reproduction
Install ComfyUI: https://github.com/comfyanonymous/ComfyUI?tab=readme-ov-file
Install the OmniGen module: https://github.com/set-soft/ComfyUI_OmniGen_Nodes
Launch the OmniGen module with transformers 4.51.3.
Expected behavior
No error in the following traceback:
File "D:\tools\ai\pinokio\api\comfy.git\app\custom_nodes\comfyui-omnigen\OmniGen\transformer.py", line 157, in forward
layer_outputs = decoder_layer(
File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 295, in forward
hidden_states, self_attn_weights = self.self_attn(
File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1736, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\torch\nn\modules\module.py", line 1747, in _call_impl
return forward_call(*args, **kwargs)
File "D:\tools\ai\pinokio\api\comfy.git\app\env\lib\site-packages\transformers\models\phi3\modeling_phi3.py", line 189, in forward
cos, sin = position_embeddings
TypeError: cannot unpack non-iterable NoneType object
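The final frame is plain Python tuple unpacking failing on a missing argument; a minimal repro, independent of transformers (names are illustrative):

```python
# Minimal repro of the traceback's last frame: the new attention forward
# does `cos, sin = position_embeddings`, and a caller that never passes
# the tuple leaves it at its default of None.
def attn_forward(hidden_states, position_embeddings=None):
    cos, sin = position_embeddings  # TypeError when position_embeddings is None
    return hidden_states

try:
    attn_forward([1.0])  # old-style call: no tuple passed
except TypeError as err:
    message = str(err)  # "cannot unpack non-iterable NoneType object"
```

So the error does not mean the weights or the model are broken; the caller simply has to supply the (cos, sin) tuple the refactored layer now expects.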