Cannot run python -m unittest locally: the SPARK_HOME variable is not set correctly

Problem description

I want to run all my test modules together locally before they reach the Jenkins build, so I use the python -m unittest tests/*.py command from my devops-config.yml file.
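
As an aside, python -m unittest tests/*.py relies on the shell expanding the glob into individual file arguments; unittest's built-in discovery is a rough equivalent that does not go through the shell:

python -m unittest discover -s tests -p "*.py"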

But I get a FileNotFoundError:

======================================================================
ERROR: setUpClass (tests.test_sdk.TestUtils)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/tests/test_sdk.py", line 26, in setUpClass
    .config("spark.sql.warehouse.dir", warehouse_dir_path) \
  File "/home/lib/python3.6/site-packages/pyspark/sql/session.py", line 173, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/home/lib/python3.6/site-packages/pyspark/context.py", line 367, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/home/lib/python3.6/site-packages/pyspark/context.py", line 133, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
  File "/home/lib/python3.6/site-packages/pyspark/context.py", line 316, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "/home/lib/python3.6/site-packages/pyspark/java_gateway.py", line 46, in launch_gateway
    return _launch_gateway(conf)
  File "/home/lib/python3.6/site-packages/pyspark/java_gateway.py", line 62, in _launch_gateway
    SPARK_HOME = _find_spark_home()
  File "/home/lib/python3.6/site-packages/pyspark/find_spark_home.py", line 65, in _find_spark_home
    paths = [os.path.abspath(p) for p in paths]
  File "/home/lib/python3.6/site-packages/pyspark/find_spark_home.py", line 65, in <listcomp>
    paths = [os.path.abspath(p) for p in paths]
  File "/usr/lib/python3.6/posixpath.py", line 376, in abspath
    cwd = os.getcwd()
FileNotFoundError: [Errno 2] No such file or directory

When I run each .py file individually from PyCharm, it runs without any errors, but when I run them all together from the terminal, every file containing Spark tests fails with the same FileNotFoundError.
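
A quick way to compare what the two environments actually see is to query the same helper the traceback goes through (a diagnostic sketch; _find_spark_home is pyspark's private API, shown here only for debugging):

import os
from pyspark.find_spark_home import _find_spark_home

# os.getcwd() is the exact call that fails in the traceback; it raises
# FileNotFoundError on its own if the current working directory has been
# deleted out from under the process.
print("cwd:", os.getcwd())
print("SPARK_HOME env var:", os.environ.get("SPARK_HOME"))
print("resolved by pyspark:", _find_spark_home())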

The Spark session setup is the same in every test file:

import unittest
from pyspark.sql import SparkSession

class TestUtils(unittest.TestCase):
    spark: SparkSession = None

    @classmethod
    def setUpClass(cls):
        # Keep the Derby metastore and the Hive warehouse under /tmp
        derby_conf = "-Dderby.system.home=/tmp/derby"
        warehouse_dir_path = "/tmp/warehouse"
        cls.spark = SparkSession.builder.master('local').enableHiveSupport() \
            .config("spark.driver.extraJavaOptions", derby_conf) \
            .config("spark.sql.warehouse.dir", warehouse_dir_path) \
            .config("spark.driver.host", "127.0.0.1") \
            .getOrCreate()

    @classmethod
    def tearDownClass(cls):
        cls.spark.stop()

    def test_util_function(self):
        db = "test_db"  # placeholder database name
        self.spark.sql(f"CREATE DATABASE {db}")
        # ... some tests ...

I also tried running:

export PYSPARK_PYTHON="/usr/lib/python3.6" && python -m unittest tests/*.py

and:

export PYSPARK_PYTHON="/usr/bin/python3" && python -m unittest tests/*.py

but with no success.

Any idea what I'm doing wrong?

Tags: python, ubuntu, pyspark

Solution


A pip-installed pyspark ships the Spark distribution inside the package itself, so SPARK_HOME should point at the pyspark directory under site-packages, and JAVA_HOME must point at a local JDK. PYSPARK_PYTHON only selects the Python interpreter Spark runs workers with, which is why exporting it did not help. You need both of these:

export JAVA_HOME='/Library/Java/JavaVirtualMachines/zulu-11.jdk/Contents/Home/' # replace with your own Java home path
export SPARK_HOME='/home/lib/python3.6/site-packages/pyspark'
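
The same variables can also be pinned from Python so the tests do not depend on the caller's shell. A minimal sketch, assuming a hypothetical helper script run_tests.py next to the tests directory (the JAVA_HOME value below is an assumption; substitute your actual JDK path):

# run_tests.py -- hypothetical wrapper: pin the environment, then run the suite.
# The variables must be set before any test module creates a SparkSession.
import os
import sys
import unittest

os.environ.setdefault("JAVA_HOME", "/usr/lib/jvm/java-11-openjdk-amd64")  # assumed JDK path
os.environ.setdefault("SPARK_HOME", "/home/lib/python3.6/site-packages/pyspark")

if __name__ == "__main__":
    suite = unittest.defaultTestLoader.discover("tests", pattern="*.py")
    result = unittest.TextTestRunner(verbosity=2).run(suite)
    sys.exit(0 if result.wasSuccessful() else 1)

Run it as python run_tests.py instead of invoking python -m unittest directly.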
