Could x-flux run on RTX 3090? #115
Comments
I encountered this error on a 3090 GPU.
Yes, so did I when using flux-dev-fp8.safetensors (11 GB). I also tried flux1-dev.safetensors (22 GB), which resulted in an out-of-memory error.
Have you tried sd-scripts for FLUX LoRA training? https://github.com/kohya-ss/sd-scripts/tree/99744af53afcb750b9a64b7efafe51f3f0da8826 It reportedly works with 24 GB VRAM GPUs.
Sorry, I haven't. But I have tried ai-toolkit for Flux LoRA training and tested the result in the web UI successfully. https://github.com/ostris/ai-toolkit/ Now I am testing Qinglongshengzhe's (青龙圣者) framework. If you are interested in it, don't hesitate to get in touch with me on QQ (593851428). https://www.bilibili.com/video/BV1RW421977P/?spm_id_from=333.788&vd_source=85be3ec5e95fd23384a835c58edf1296 We could do a great job together.
I tried to train x-flux with LoRA on an RTX 3090 GPU, but it runs out of memory.
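A rough back-of-the-envelope check makes the OOM plausible. The sketch below estimates how much VRAM the model weights alone consume at different precisions; the ~12B parameter count for FLUX.1-dev is an assumption for illustration, and the estimate ignores activations, gradients, and optimizer state, which add further overhead on top.

```python
# Rough VRAM estimate for holding FLUX weights at various precisions.
# Assumption: FLUX.1-dev has roughly 12 billion parameters.
GIB = 1024 ** 3
FLUX_PARAMS = 12_000_000_000

def weight_gib(n_params: int, bytes_per_param: int) -> float:
    """GiB needed just to store the weights at the given precision."""
    return n_params * bytes_per_param / GIB

bf16_gib = weight_gib(FLUX_PARAMS, 2)  # ~22.4 GiB, matches the 22 GB checkpoint
fp8_gib = weight_gib(FLUX_PARAMS, 1)   # ~11.2 GiB, matches the 11 GB checkpoint

print(f"bf16 weights: {bf16_gib:.1f} GiB")
print(f"fp8 weights:  {fp8_gib:.1f} GiB")
```

The bf16 weights alone nearly fill a 24 GB card, which is why the 22 GB checkpoint fails outright, and even the fp8 checkpoint leaves limited headroom for activations and optimizer state during training.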