
Commit f5fd718

comments
1 parent efec497 commit f5fd718

File tree

2 files changed: +9 additions, −5 deletions

2 files changed

+9
-5
lines changed

core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala

Lines changed: 2 additions & 1 deletion

@@ -125,14 +125,15 @@ private[deploy] class SparkSubmitArguments(args: Seq[String], env: Map[String, S
    * When this is called, `sparkProperties` is already filled with configs from the latter.
    */
   private def mergeDefaultSparkProperties(): Unit = {
-    // Honor --conf before the defaults file
+    // Honor --conf before the specified properties file and defaults file
     defaultSparkProperties.foreach { case (k, v) =>
       if (!sparkProperties.contains(k)) {
         sparkProperties(k) = v
       }
     }
 
     // Also load properties from `spark-defaults.conf` if they do not exist in the properties file
+    // and --conf list
     val defaultSparkConf = Utils.getDefaultPropertiesFile(env)
     Option(defaultSparkConf).foreach { filename =>
       val properties = Utils.getPropertiesFromFile(filename)
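
To make the merge rule in `mergeDefaultSparkProperties` concrete, here is a small standalone sketch (the `MergeDefaultsSketch` object and its `mergeDefaults` helper are hypothetical names, not part of this commit): entries already present in `sparkProperties` (from `--conf` or from the file passed via `--properties-file`) are never overwritten by values read from `spark-defaults.conf`.

import scala.collection.mutable

object MergeDefaultsSketch {
  // Copy a default into the map only when no value for that key exists yet,
  // mirroring the `if (!sparkProperties.contains(k))` guard in the patch.
  def mergeDefaults(
      sparkProperties: mutable.Map[String, String],
      defaults: Map[String, String]): Unit = {
    defaults.foreach { case (k, v) =>
      if (!sparkProperties.contains(k)) {
        sparkProperties(k) = v
      }
    }
  }

  def main(args: Array[String]): Unit = {
    // Simulates `--conf spark.serializer=...` (or the same key in a --properties-file).
    val sparkProperties = mutable.Map(
      "spark.serializer" -> "org.apache.spark.serializer.KryoSerializer")
    // Simulates entries read from spark-defaults.conf.
    val sparkDefaultsConf = Map(
      "spark.serializer" -> "org.apache.spark.serializer.JavaSerializer",
      "spark.eventLog.enabled" -> "true")

    mergeDefaults(sparkProperties, sparkDefaultsConf)
    // KryoSerializer wins; spark.eventLog.enabled is filled in from the defaults file.
    sparkProperties.foreach { case (k, v) => println(s"$k = $v") }
  }
}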

docs/configuration.md

Lines changed: 7 additions & 4 deletions

@@ -111,12 +111,15 @@ each line consists of a key and a value separated by whitespace. For example:
     spark.eventLog.enabled true
     spark.serializer org.apache.spark.serializer.KryoSerializer
 
+In addition, a property file with Spark configurations can be passed to `bin/spark-submit` via
+the `--properties-file` parameter.
+
 Any values specified as flags or in the properties file will be passed on to the application
 and merged with those specified through SparkConf. Properties set directly on the SparkConf
-take highest precedence, then flags passed to `spark-submit` or `spark-shell`, then options
-in the `spark-defaults.conf` file. A few configuration keys have been renamed since earlier
-versions of Spark; in such cases, the older key names are still accepted, but take lower
-precedence than any instance of the newer key.
+take the highest precedence, then those through `--conf` flags or `--properties-file` passed to
+`spark-submit` or `spark-shell`, then options in the `spark-defaults.conf` file. A few
+configuration keys have been renamed since earlier versions of Spark; in such cases, the older
+key names are still accepted, but take lower precedence than any instance of the newer key.
 
 Spark properties mainly can be divided into two kinds: one is related to deploy, like
 "spark.driver.memory", "spark.executor.instances", this kind of properties may not be affected when
