[SPARK-54456][PYTHON] Import worker module after fork to avoid deadlock #53166
What changes were proposed in this pull request?
We lazy-import the worker module after fork to avoid potential deadlocks caused by importing modules that spawn multiple threads before forking.
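A minimal sketch of the idea, assuming an illustrative helper and `main` signature (this is not the actual daemon.py diff):

```python
import importlib
import os


def serve_one(worker_module_name, infile, outfile):
    # Parent: deliberately do NOT import the worker module here, so no extra
    # threads get started before fork().
    pid = os.fork()
    if pid == 0:
        # Child: import lazily, after fork. Any threads the module spawns at
        # import time are created fresh in the child, so there is no inherited
        # lock state to deadlock on.
        worker = importlib.import_module(worker_module_name)
        worker.main(infile, outfile)  # illustrative signature
        os._exit(0)
    return pid
```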
Why are the changes needed?
https://discuss.python.org/t/switching-default-multiprocessing-context-to-spawn-on-posix-as-well/21868
It's impossible to do a thread-safe fork in CPython. CPython started issuing warnings in 3.12 and switched the default `multiprocessing` start method to "spawn" on POSIX in 3.14. It would be a huge effort for us to give up `fork` entirely, but we can try our best not to import random modules before fork by lazy-importing the worker module after fork. We already have some workers that import dangerous libraries like `pyarrow` (`plan_data_source_read`, for example).
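To make the failure mode concrete, here is a small, hypothetical illustration (not Spark code) of how a lock held by a background thread at fork time can hang the child:

```python
import os
import threading
import time

lock = threading.Lock()


def background_worker():
    # Simulates import-time work in a library (e.g. a thread pool warming up)
    # that repeatedly holds a lock.
    while True:
        with lock:
            time.sleep(0.01)


threading.Thread(target=background_worker, daemon=True).start()
time.sleep(0.05)  # give the background thread a chance to grab the lock

pid = os.fork()
if pid == 0:
    # Child: the background thread does not exist here, but the lock it held
    # at fork time is still locked in the child's copy of memory, so acquiring
    # it would block forever without a timeout.
    acquired = lock.acquire(timeout=1)
    print("child acquired lock:", acquired)
    os._exit(0)
os.waitpid(pid, 0)
```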
Does this PR introduce any user-facing change?
No
How was this patch tested?
CI should pass.
Was this patch authored or co-authored using generative AI tooling?
No