Setting up PySpark for Jupyter Notebook: worker and driver Python version mismatch?

Problem description

I'm having a hard time setting up and using pyspark locally.

I have a conda environment associated with my jupyter notebook. Below is what I entered in the terminal after installing pyspark there:

pip install pyspark
pip install findspark

which python3.6

export PYSPARK_DRIVER_PYTHON=  # result of 'which python3.6'
export PYSPARK_PYTHON=         # result of 'which python3.6'

python --version
# result: Python 3.6.12 :: Anaconda, Inc.


java -version
# java version 1.8.0_25
# SE Runtime Environment (build 1.8.0_25-b17)

pyspark
#... spark version 3.0.1, using python version 3.7.4 (default)
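Variables exported in a terminal only reach a notebook kernel if Jupyter is started from that same shell, so it is worth checking what the kernel actually inherited. A minimal diagnostic sketch (not part of the original question) to run in a notebook cell:

import os
import sys

# Interpreter behind the notebook kernel, i.e. what the Spark driver will use
print("driver python:", sys.executable, sys.version_info[:2])

# What PySpark will launch the workers with; if these are unset, PySpark
# falls back to a default python, which is how a 3.6/3.7 mismatch can appear
print("PYSPARK_PYTHON        =", os.environ.get("PYSPARK_PYTHON"))
print("PYSPARK_DRIVER_PYTHON =", os.environ.get("PYSPARK_DRIVER_PYTHON"))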

Here is the code in the jupyter notebook that I'm trying to get working:

import pyspark
from pyspark.sql.types import *
from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext

try:
    conf = pyspark.SparkConf().set('spark.driver.host','127.0.0.1')
    sc = pyspark.SparkContext(master='local', appName='samsApp',conf=conf)
    sqlContext = SQLContext(sc)
    print("Binding")
except ValueError:
    print("Spark session already created")

# below code from stack overflow: how to create a pyspark dataframe
# with a single ArrayType column
cSchema = StructType([StructField("WordList", ArrayType(StringType()))])
test_list = [['Hello', 'world']], [['I', 'am', 'fine']]  # two rows, each holding one word list

df = sqlContext.createDataFrame(test_list,schema=cSchema)

df.show()

The last line above (df.show()) produces this error:

Py4JJavaError: An error occurred while calling o41.showString.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, macbook-pro, executor driver): org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/Users/j.doe/anaconda3/envs/package_env/lib/python3.6/site-packages/pyspark/python/lib/pyspark.zip/pyspark/worker.py", line 477, in main
    ("%d.%d" % sys.version_info[:2], version))
Exception: Python in worker has different version 3.7 than that in driver 3.6, PySpark cannot run with different minor versions. Please check environment variables PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON are correctly set.

How should I fix this? I don't know how to make the worker and driver versions match. Any advice is appreciated; I haven't found a straightforward answer online that works for me.

Tags: python, python-3.x, apache-spark, pyspark, jupyter-notebook

Solution

Point PYSPARK_PYTHON at the Python you want the workers to use and PYSPARK_DRIVER_PYTHON at your Jupyter executable, then run pyspark so that Jupyter is launched as the driver with matching interpreters:

export PYSPARK_PYTHON=<python path>
export PYSPARK_DRIVER_PYTHON=<jupyter path>
pyspark
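
If relaunching Jupyter through pyspark is not convenient, the same idea can usually be applied from inside the notebook by pointing both variables at the kernel's own interpreter before the SparkContext is created. A sketch under that assumption, where sys.executable stands in for the path from which python3.6:

import os
import sys
import pyspark
from pyspark.sql import SQLContext

# Must happen before the SparkContext exists: the worker processes are
# started with whatever PYSPARK_PYTHON holds at that moment.
os.environ["PYSPARK_PYTHON"] = sys.executable
os.environ["PYSPARK_DRIVER_PYTHON"] = sys.executable

conf = pyspark.SparkConf().set('spark.driver.host', '127.0.0.1')
sc = pyspark.SparkContext(master='local', appName='samsApp', conf=conf)
sqlContext = SQLContext(sc)

With both variables pointing at the same interpreter, the worker and the driver report the same minor version and the mismatch exception should no longer be raised.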
