[SPARK-48392][CORE] Also load spark-defaults.conf when provided --properties-file
### What changes were proposed in this pull request?
Currently, if a property file is provided as an argument to Spark submit, `spark-defaults.conf` is ignored. This PR changes the behavior so that `spark-defaults.conf` is still loaded in this scenario, and any Spark configurations present in that file but not in the input property file are also applied.
### Why are the changes needed?
Currently, if a property file is provided as an argument to Spark submit, `spark-defaults.conf` is ignored. This is inconvenient for users who want to split Spark configurations between the two files. For example, in Spark on K8S, users may want to store system-wide default settings in `spark-defaults.conf`, while keeping more dynamic, user-specified configurations in a property file passed to Spark applications via the `--properties-file` parameter. Currently this is not possible. See also kubeflow/spark-operator#1321.
### Does this PR introduce _any_ user-facing change?
Yes. Now, when a property file is provided via `--properties-file`, `spark-defaults.conf` will also be loaded. However, configurations specified in the `--properties-file` file take precedence over the same configurations in `spark-defaults.conf`.
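The merge semantics described above can be illustrated with a small sketch (this is illustrative Python, not Spark's actual Scala implementation; the property names and helper are hypothetical): keys from the user-supplied properties file win, and `spark-defaults.conf` only fills in keys the user file does not set.

```python
def merge_properties(user_props: dict, default_props: dict) -> dict:
    """Return user_props overlaid on default_props.

    Keys present in user_props take precedence; keys only in
    default_props are carried over unchanged.
    """
    merged = dict(default_props)   # start from spark-defaults.conf
    merged.update(user_props)      # user --properties-file wins on conflicts
    return merged

# Hypothetical contents of the two files:
defaults = {
    "spark.eventLog.enabled": "true",   # only in spark-defaults.conf
    "spark.executor.memory": "2g",      # overridden by the user file
}
user = {
    "spark.executor.memory": "4g",      # user-specified value
}

merged = merge_properties(user, defaults)
# merged["spark.executor.memory"]  -> "4g"   (user file wins)
# merged["spark.eventLog.enabled"] -> "true" (filled from defaults)
```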
### How was this patch tested?
Existing tests and a new test.
### Was this patch authored or co-authored using generative AI tooling?
No
Closes apache#46709 from sunchao/SPARK-48392.
Authored-by: Chao Sun <[email protected]>
Signed-off-by: Chao Sun <[email protected]>