
Commit 1123672

[SW-2789] Remove Python 2.7 support (#5625)
1 parent 2295829 · commit 1123672

19 files changed (+13 additions, −41 deletions)


bin/sparkling-env.sh
Lines changed: 0 additions & 1 deletion

@@ -67,7 +67,6 @@ function checkPythonPackages() {
   error=0
   checkPythonPackage "$packages" "requests"
   checkPythonPackage "$packages" "tabulate"
-  checkPythonPackage "$packages" "future" "0.4.0"

   if [ $error == -1 ]; then
     exit -1
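Note: the commit only touches the call sites; the body of checkPythonPackage is not part of this diff. For orientation, a minimal sketch of what such a pip-based check could look like — the helper name and the error convention come from the context lines above, but the body itself is an assumption:

function checkPythonPackage() {
    # Hypothetical sketch -- the real helper in bin/sparkling-env.sh is not
    # shown in this diff and may differ. A third minimum-version argument
    # (as in the removed "future" "0.4.0" call) is ignored here for brevity.
    local packages="$1"   # expected: captured `pip list` output
    local name="$2"
    if ! echo "$packages" | grep -qi "^${name}"; then
        echo "Python package '${name}' is not installed" >&2
        error=-1          # picked up by the `if [ $error == -1 ]` check above
    fi
}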

booklet/src/sections/deployment.tex
Lines changed: 1 addition & 2 deletions

@@ -195,15 +195,14 @@ \subsubsection{Running PySparkling}
 \item PySparkling zip file
 \item Python Module: request
 \item Python Module: tabulate
-\item Python Module: future
 \end{itemize}

 \textbf{Steps}:
 \begin{enumerate}
 \item Create a new Python library containing the PySparkling zip file.
 \item Download the selected Sparkling Water version from \url{https://www.h2o.ai/download/}.
 \item The PySparkling zip file is located in the sparkling water zip file at the following location: `py/h2o\_pysparkling\_*.zip.`
-\item Create libraries for the following python modules: request, tabulate and future.
+\item Create libraries for the following python modules: request, tabulate.
 \item Attach the PySparkling library and python modules to the cluster.
 \item Create a new python notebook.
 \item Create an H2O cluster inside the Spark cluster:
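The enumerate ends mid-step here because the diff context is truncated; the H2O cluster creation code that follows in the booklet is not part of this change. As a rough illustration only (assuming the standard PySparkling entry point; the booklet's exact code may differ), the final step boils down to a session like:

$ ./bin/pysparkling
>>> from pysparkling import H2OContext
>>> hc = H2OContext.getOrCreate()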

booklet/src/sections/intro.tex
Lines changed: 3 additions & 3 deletions

@@ -145,14 +145,14 @@

 \subsection{Sparkling Water Requirements}

-Sparkling Water supports Spark 2.1, 2.2, 2.3, 2.4 (except Spark 2.4.2) and 3.0. In specific examples of this
+Sparkling Water supports Spark 2.3, 2.4 (except Spark 2.4.2) and 3.0. In specific examples of this
 booklet we refer to artifacts for Spark 3.0 and use Sparkling Water 3.30.0.7, however, the Sparkling Water code is
 consistent across versions for each Spark.

 \begin{itemize}
 \item Linux/OS X/Windows
-\item Java 1.8 or higher
-\item Python 2.7+ For Python version of Sparkling Water (PySparkling)
+\item Java 8 or higher
+\item Python SUBST_MIN_SUPPORTED_PYTHON+ For Python version of Sparkling Water (PySparkling)
 \item R 3.4+ for R version of Sparkling Water (RSparkling)
 \item Installed Spark and have SPARK\_HOME environmental variable pointing to its home.
 \end{itemize}
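A quick way to confirm that a local environment meets the revised requirements (the commands are standard; the version expectations follow the list above):

$ java -version        # expect Java 8 or higher
$ python --version     # expect at least the minimum supported Python 3.x
$ R --version          # expect R 3.4+ if using RSparkling
$ echo $SPARK_HOME     # should point at the Spark installation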

booklet/src/sections/starting.tex
Lines changed: 0 additions & 1 deletion

@@ -14,7 +14,6 @@ \subsection{Setting up the Environment}
 \begin{itemize}
 \item requests
 \item tabulate
-\item future
 \end{itemize}.
 Also please make sure that your Python environment is set-up to run regular Spark applications.
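With future gone, preparing the Python environment reduces to the two remaining packages, e.g.:

$ pip install requests tabulate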

doc/src/site/sphinx/deployment/azure_hdi.rst
Lines changed: 1 addition & 1 deletion

@@ -15,7 +15,7 @@ The H2O Artificial Intelligence for Azure HDInsight is an application you can in
 Create the H2O AI for Azure HDInsight
 '''''''''''''''''''''''''''''''''''''

-**Requirement**: Python 2.7 or 3.
+**Requirements**: Python SUBST_MIN_SUPPORTED_PYTHON+.

 Follow the steps below to create a new H2O Artificial Intelligence for Azure HDInsight.

doc/src/site/sphinx/deployment/sw_google_cloud_dataproc.rst
Lines changed: 1 addition & 1 deletion

@@ -19,7 +19,7 @@ In this tutorial we will use Dataproc image version 2.0-debian10 which has Spark
     --region $GCP_REGION \
     --image-version 2.0-debian10 \
     --num-workers 3 \
-    --properties='^#^dataproc:pip.packages=tabulate==0.8.3,requests==2.21.0,future==0.17.1'
+    --properties='^#^dataproc:pip.packages=tabulate==0.8.3,requests==2.21.0'

 .. content-tabs::
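The hunk shows only the tail of the command; a full invocation would look roughly like the following (the cluster name variable is illustrative, the flags are taken from the diff context):

$ gcloud dataproc clusters create $CLUSTER_NAME \
    --region $GCP_REGION \
    --image-version 2.0-debian10 \
    --num-workers 3 \
    --properties='^#^dataproc:pip.packages=tabulate==0.8.3,requests==2.21.0'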

doc/src/site/sphinx/pysparkling.rst
Lines changed: 1 addition & 2 deletions

@@ -71,7 +71,7 @@ For example, to install PySparkling for Spark SUBST_SPARK_MAJOR_VERSION, the com
 Dependencies
 ------------

-Supported Python versions are Python 2.7 or Python 3+.
+Supported Python versions are Python SUBST_MIN_SUPPORTED_PYTHON+.

 The major dependency is Spark. Please make sure that your Python environment has functional Spark with all its
 dependencies.
@@ -80,7 +80,6 @@ dependencies.

 $ pip install requests
 $ pip install tabulate
-$ pip install future

 These dependencies are installed automatically in case PySparkling is installed from PyPI.
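Since the page notes these dependencies are pulled in automatically when PySparkling comes from PyPI, a plain install of the PySparkling package itself is enough; for example, for Spark 3.0 (package naming per the install section referenced in the hunk header):

$ pip install h2o_pysparkling_3.0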

gradle-spark2.1.properties
Lines changed: 1 addition & 1 deletion

@@ -6,4 +6,4 @@ scalaVersion=2.11.12
 fabricK8sClientVersion=4.6.4
 executorOverheadMemoryOption=spark.yarn.executor.memoryOverhead
 driverOverheadMemoryOption=spark.yarn.driver.memoryOverhead
-supportedPythonVersions=2.7 3.6
+supportedPythonVersions=3.6

gradle-spark2.3.properties
Lines changed: 1 addition & 1 deletion

@@ -6,4 +6,4 @@ scalaVersion=2.11.12
 fabricK8sClientVersion=4.6.4
 executorOverheadMemoryOption=spark.executor.memoryOverhead
 driverOverheadMemoryOption=spark.driver.memoryOverhead
-supportedPythonVersions=2.7 3.6 3.7
+supportedPythonVersions=3.6 3.7

gradle-spark2.4.properties
Lines changed: 1 addition & 1 deletion

@@ -7,4 +7,4 @@ databricksVersion=6.6.x-cpu-ml-scala2.11
 fabricK8sClientVersion=4.6.4
 executorOverheadMemoryOption=spark.executor.memoryOverhead
 driverOverheadMemoryOption=spark.driver.memoryOverhead
-supportedPythonVersions=2.7 3.6 3.7
+supportedPythonVersions=3.6 3.7
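Illustrative only (not necessarily how the Sparkling Water build consumes this property): the post-commit supported versions can be read back from any of these properties files with standard shell tools:

$ grep '^supportedPythonVersions=' gradle-spark2.4.properties | cut -d= -f2
3.6 3.7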
