Use variables in %%configure #834

@sunayansaikia

Description

Is your feature request related to a problem? Please describe.

Currently, in the PySpark kernel wrapper, we can define `%%configure` in a notebook cell as described below:

%%configure
{
    "executorCores": 2,
    "conf": {
        "spark.sql.some.key": "a_predefined_fixed_value"
    }
}

However, it does not seem to allow passing values dynamically at runtime. In the configuration above, I'd like to be able to set `executorCores` and `spark.sql.some.key` dynamically, e.g. to something that a function returns.

Describe the solution you'd like
Would it be possible to enable something like the example below, or perhaps something better?

import json
import os

executor_cores = get_executor_cores()      # hypothetical helper; assume the value is derived from a function
custom_value = os.environ["CUSTOM_VALUE"]  # assume derived from an environment variable

spark_conf = {
    "executorCores": int(executor_cores),
    "conf": {
        "spark.sql.some.key": str(custom_value)
    }
}
spark_conf_json = json.dumps(spark_conf)

%%configure
"$spark_conf_json"
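As a point of comparison, here is a minimal sketch of how the magic might be driven programmatically today, assuming the wrapper kernel is IPython-based, registers `%%configure` as a cell magic, and accepts a `-f` force flag (all assumptions about this kernel; the config values are placeholders from the example above):

```python
import json

# Build the JSON payload at runtime (same shape as the spark_conf example above;
# the values here are placeholders).
spark_conf_json = json.dumps({
    "executorCores": 2,
    "conf": {"spark.sql.some.key": "a_runtime_value"}
})

# get_ipython() is only defined inside an IPython/Jupyter session, so guard it;
# outside a notebook this is a no-op.
try:
    ip = get_ipython()
except NameError:
    ip = None

if ip is not None:
    # Assumption: the kernel exposes %%configure as a registered cell magic,
    # so it can be invoked with run_cell_magic(magic_name, line, cell).
    ip.run_cell_magic("configure", "-f", spark_conf_json)
```

This side-steps the static-JSON limitation, but first-class variable substitution in `%%configure` would be cleaner.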

Describe alternatives you've considered
No alternative solution found

Additional context
No additional context
