
load ltx loras trained with finetrainers #6174

Open
neph1 wants to merge 3 commits into master from fix-loading-LTX-loras
Conversation

@neph1 commented Dec 23, 2024

Hi
I've been playing around with finetuning LTX-Video using finetrainers, and by default the loras won't load in ComfyUI. I noticed the key names were slightly different, with "transformer." prepended:

transformer.transformer_blocks.0.attn1.to_k.lora_A.weight

This PR makes them recognized by ComfyUI, but I have no idea whether it's a viable solution, if it applies to other loras than these, or if it should be fixed elsewhere.
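For illustration, here is the mismatch in miniature; the "expected" form below is inferred from the description above, not quoted from ComfyUI's source:

```python
# The same LoRA weight, as written by finetrainers vs. the prefix-free form
# described above (inferred for illustration, not quoted from ComfyUI).
finetrainers_key = "transformer.transformer_blocks.0.attn1.to_k.lora_A.weight"
comfy_expected = "transformer_blocks.0.attn1.to_k.lora_A.weight"
assert finetrainers_key == "transformer." + comfy_expected
```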

@comfyanonymous (Owner) commented
This isn't the correct way of doing this.

The correct way is adding an entry like: https://github.com/comfyanonymous/ComfyUI/blob/master/comfy/lora.py#L384

The even more correct way is to convert loras to the ComfyUI format, which is the format used when you use the lora save node.
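As a rough illustration of the kind of key-map entry meant here, a hypothetical sketch (not the actual comfy/lora.py code), assuming `key_map` maps lora-file key names to model weight keys:

```python
# Hypothetical sketch: add aliases so diffusers/finetrainers-style
# "transformer."-prefixed LoRA keys resolve to the same model weights that
# ComfyUI already maps. The function name and the "diffusion_model." prefix
# are assumptions for illustration, not ComfyUI's actual code.
def add_ltx_diffusers_aliases(key_map: dict) -> dict:
    prefix = "diffusion_model."
    for comfy_key in list(key_map.values()):
        if comfy_key.startswith(prefix) and comfy_key.endswith(".weight"):
            inner = comfy_key[len(prefix):-len(".weight")]
            # finetrainers names the same tensor "transformer.<inner>"
            key_map["transformer.{}".format(inner)] = comfy_key
    return key_map
```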

@neph1 neph1 marked this pull request as draft December 23, 2024 08:34
@neph1 (Author) commented Dec 23, 2024

Currently this doesn't work. I'll keep working on it, but I don't have time right now.

@neph1 (Author) commented Dec 25, 2024

This seems to do the trick. Although, again, I won't argue that this is the right way of doing it, or that all LTX loras look like this. Just close it if it's not admissible.

@neph1 neph1 marked this pull request as ready for review December 25, 2024 17:10
@linesword commented

@neph1 I have the same problem. I want to use a lora trained with finetrainers with the 0.9.1 model, but it seems that isn't possible at present? How are you dealing with it now?

@neph1 (Author) commented Dec 29, 2024

Do you mean you have trained with 0.9.1? They have only released the bf16 version of that, right? It's a whole different model, so any loras will need to be trained from scratch.

@linesword commented

> Do you mean you have trained with 0.9.1? They have only released the bf16 version of that, right? It's a whole different model, so any loras will need to be trained from scratch.

I trained it based on a-r-r-o-w/LTX-Video-diffusers, but I'm not sure if this lora is suitable for the 0.9.1 model.

@neph1 (Author) commented Dec 29, 2024

I don't think so. They need to release the relevant transformers model, I believe.

@linesword commented

Hi, the lora can be loaded and mixed in diffusers, but it cannot be loaded in ComfyUI. How do you run it in ComfyUI? I find that even after adding the modified code, the trained lora still doesn't run? @neph1

@neph1 (Author) commented Jan 2, 2025

I have noticed as well that it doesn't seem to work in later versions.
My 'stable' version of ComfyUI is at 4e14032; loading loras works fine there. But I downloaded a second version to work on this PR, 4b5bcd8, and there it doesn't seem to work very well.

@neph1 (Author) commented Jan 4, 2025

@linesword fwiw, I've uploaded my script that renames the keys so that the lora is recognized by comfyui here: https://github.com/neph1/finetrainers-ui/tree/main/scripts
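A minimal sketch of what such a rename script can look like, assuming the only difference is the "transformer." prefix noted in the PR description (an illustration, not the actual script from the linked repo; file names are hypothetical):

```python
from safetensors.torch import load_file, save_file

def rename_lora_keys(src_path: str, dst_path: str) -> None:
    """Strip the "transformer." prefix that finetrainers adds to LoRA keys."""
    state_dict = load_file(src_path)
    renamed = {}
    for key, tensor in state_dict.items():
        if key.startswith("transformer."):
            key = key[len("transformer."):]
        renamed[key] = tensor
    save_file(renamed, dst_path)

# Hypothetical file names, for illustration only.
rename_lora_keys("ltx_lora_finetrainers.safetensors", "ltx_lora_comfy.safetensors")
```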

@neph1 (Author) commented Jan 11, 2025

I've confirmed that lora loading works with 0.9.1. Example (lora trained with 0.9.0, inferenced with 0.9.1, using comfy's built-in t2v nodes):
No lora: [image: ComfyUI_01597_]
Lora: [image: AnimateDiff_00001]
Training image: [image: 00068-515901565]

(I'd show an animation, but webp isn't supported)

@comfyanonymous Can this be considered for merging?

@yic03685 commented

Hi @neph1, thanks so much for working on this. Can you summarize the conclusion about training and inference? If we train on 0.9.1 with finetrainers, using your UI, the keys are properly mapped. Then why do we still need this PR?

@neph1 (Author) commented Jan 30, 2025

If you don't need it, great! Maybe something has changed in either diffusers or comfyui to better align the two. I only know that when I wrote it, it was needed. I also got a PM only two days ago saying this PR was the only way they could get their lora working. So maybe it's something about versions?

@yic03685 commented

No, I meant I haven't tried it yet and wonder what the steps are. So it sounds like I still need this PR? Does it work with the latest comfyui build?

@neph1 (Author) commented Jan 30, 2025

If you want to use this with the latest comfyui, I guess you need to clone my fork: https://github.com/neph1/ComfyUI/
then:

git checkout master
git pull origin
git checkout fix-loading-LTX-loras
git merge master

and hope there are no conflicts :)
I've synced the fork to the latest changes, but not updated the branch.

Edit: Or just copy-paste those 6 lines into your lora.py, if you're comfortable with that.
