fix lora switch #1143
base: main
Conversation
Walkthrough

The changes in this pull request involve modifications to `onediff_sd_webui_extensions/onediff_lora.py`.
Actionable comments posted: 0
🧹 Outside diff range and nitpick comments (1)
onediff_sd_webui_extensions/onediff_lora.py (1)
Lines 42-44: Document the architectural change

The switch from parameter-based model access to using `onediff_shared.current_unet_graph` represents a significant architectural change. Please add a comment explaining:

- Why this change was necessary for fixing the LoRA switching issue
- How `current_unet_graph` is managed throughout its lifecycle
- Any considerations for future maintainers

Add a comment like:

```diff
+ # Using global graph module instead of p.sd_model to ensure consistent
+ # model reference during LoRA switching with ControlNet (fixes #1141)
  sd_model = onediff_shared.current_unet_graph.graph_module
```
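To make the pattern behind this suggestion concrete, here is a hedged, self-contained sketch of the before/after access styles. The class and function names below are illustrative stand-ins, not code from the diff; only `current_unet_graph` and `graph_module` appear in the actual change.

```python
# Hypothetical sketch of the access-pattern change under review.
# Names other than `current_unet_graph` / `graph_module` are illustrative.

class OneDiffCompiledGraph:
    """Minimal stand-in for the shared compiled-graph holder."""
    def __init__(self, graph_module=None):
        self.graph_module = graph_module

# Module-level shared state, analogous to onediff_shared.current_unet_graph
current_unet_graph = OneDiffCompiledGraph()

def get_unet_old(p):
    # Before: the model reference is read off the processing object `p`;
    # after a ControlNet-triggered model swap this can point at a stale module.
    return p.sd_model.model.diffusion_model

def get_unet_new():
    # After: read the graph module the extension itself manages, so LoRA
    # patching always targets the currently compiled UNet.
    return current_unet_graph.graph_module
```

Under this sketch, LoRA code calling `get_unet_new()` keeps seeing the current compiled graph even when the object `p` refers to has been swapped.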
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
📒 Files selected for processing (1)

- `onediff_sd_webui_extensions/onediff_lora.py` (2 hunks)
🔇 Additional comments (2)
onediff_sd_webui_extensions/onediff_lora.py (2)
Lines 2-4: LGTM: Import changes are appropriate

The addition of the `onediff_shared` import is necessary for the new model access method and is properly placed in the import section.
Lines 42-44: Verify thread safety of global model reference

The change from a parameter-based model reference (`p.sd_model.model.diffusion_model`) to a global reference (`onediff_shared.current_unet_graph.graph_module`) might introduce thread-safety concerns in concurrent scenarios.

Let's check if there are any concurrent access patterns:
✅ Verification successful
Based on the gathered information, I can now provide the final response:
Thread safety concerns are mitigated by SD WebUI's execution model
The global `current_unet_graph` is used within a controlled context where:
- It's initialized once at module import time
- Model switching and compilation happen in a sequential manner through SD WebUI's pipeline
- The graph module is swapped atomically during model initialization
- No concurrent modification patterns were found in the codebase
The change from parameter-based to global reference is safe in this context as SD WebUI processes requests sequentially by design.
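The lifecycle described above can be sketched as a small model. All names here are illustrative; this only demonstrates the single-holder, atomic-rebind pattern the review relies on, not the real SD WebUI pipeline.

```python
# Illustrative model of the lifecycle described above: one holder created at
# import time, whose graph_module attribute is rebound as a whole during
# (sequential) model initialization, then read during generation.

class GraphHolder:
    def __init__(self):
        self.graph_module = None  # set once at module import time

def swap_graph(holder, new_graph):
    # Rebinding a single attribute is effectively atomic under CPython's GIL:
    # a reader observes either the old or the new graph, never a mix.
    holder.graph_module = new_graph

holder = GraphHolder()  # module-import-time initialization

def run_generation(holder):
    # In SD WebUI, generation runs only after initialization has completed,
    # so this read does not race with the swap.
    return holder.graph_module

swap_graph(holder, "compiled_unet_v1")
print(run_generation(holder))  # prints "compiled_unet_v1"
```

Because requests are processed sequentially, the swap and the read never overlap, which is why no explicit lock is needed in this design.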
🏁 Scripts executed
The following scripts were executed for the analysis:
Script:

```shell
#!/bin/bash
# Search for any concurrent processing or threading related code
rg -l "Thread|threading|concurrent|asyncio|multiprocessing"

# Check how current_unet_graph is managed
ast-grep --pattern 'current_unet_graph = $_'
```

Length of output: 518
Script:

```shell
#!/bin/bash
# Check the OneDiffCompiledGraph implementation and usage
ast-grep --pattern 'class OneDiffCompiledGraph'

# Check where current_unet_graph is accessed/modified
rg "current_unet_graph" -A 3

# Check if there are any locks or synchronization mechanisms
rg "Lock|Semaphore|RLock|mutex"
```

Length of output: 4728
Script:

```shell
#!/bin/bash
# Check the OneDiffCompiledGraph class definition and its usage pattern
cat onediff_sd_webui_extensions/onediff_shared.py

# Check the context where the graph module is used
rg -B 3 "graph_module" onediff_sd_webui_extensions/scripts/onediff.py

# Look for any initialization or setup code
ast-grep --pattern 'def setup($$$)'
```

Length of output: 728
Fix: LoRA cannot be switched when used with ControlNet; see more in #1141.