Running PySpark with pipenv (Ubuntu)

Problem description

I am trying to run PySpark inside a Pipenv environment. I installed the required packages, but I get the following error:

I followed this guide: https://youtu.be/MLXOy-OhWRY

(test2-nZrvIDNR) andresg3@andresg3-Lenovo-U430-Touch:~/pipenv/test2$ pyspark
Python 3.7.5 (default, Nov 20 2019, 09:21:52) 
[GCC 9.2.1 20191008] on linux
Type "help", "copyright", "credits" or "license" for more information.
20/04/17 21:32:03 WARN Utils: Your hostname, andresg3-Lenovo-U430-Touch resolves to a loopback address: 127.0.1.1; using 192.168.50.138 instead (on interface wlp2s0)
20/04/17 21:32:03 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
20/04/17 21:32:04 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
20/04/17 21:32:10 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
java.lang.reflect.Constructor.newInstance(Constructor.java:423)
py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
py4j.Gateway.invoke(Gateway.java:238)
py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
py4j.GatewayConnection.run(GatewayConnection.java:238)
java.lang.Thread.run(Thread.java:748)
/usr/local/spark//python/pyspark/shell.py:45: UserWarning: Failed to initialize Spark session.
  warnings.warn("Failed to initialize Spark session.")
Traceback (most recent call last):
  File "/usr/local/spark//python/pyspark/shell.py", line 41, in <module>
    spark = SparkSession._create_shell_session()
  File "/usr/local/spark/python/pyspark/sql/session.py", line 615, in _create_shell_session
    return SparkSession.builder.getOrCreate()
  File "/usr/local/spark/python/pyspark/sql/session.py", line 183, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/usr/local/spark/python/pyspark/context.py", line 371, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/usr/local/spark/python/pyspark/context.py", line 131, in __init__
    conf, jsc, profiler_cls)
  File "/usr/local/spark/python/pyspark/context.py", line 193, in _do_init
    self._jsc = jsc or self._initialize_context(self._conf._jconf)
  File "/usr/local/spark/python/pyspark/context.py", line 310, in _initialize_context
    return self._jvm.JavaSparkContext(jconf)
  File "/home/andresg3/.local/share/virtualenvs/test2-nZrvIDNR/lib/python3.7/site-packages/py4j/java_gateway.py", line 1525, in __call__
    answer, self._gateway_client, None, self._fqn)
  File "/home/andresg3/.local/share/virtualenvs/test2-nZrvIDNR/lib/python3.7/site-packages/py4j/protocol.py", line 328, in get_return_value
    format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.NoClassDefFoundError: scala/xml/Text
    at org.apache.spark.ui.jobs.AllJobsPage.<init>(AllJobsPage.scala:45)
    at org.apache.spark.ui.jobs.JobsTab.<init>(JobsTab.scala:44)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:59)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:81)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:178)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:480)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:238)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
    at py4j.GatewayConnection.run(GatewayConnection.java:238)
    at java.lang.Thread.run(Thread.java:748)

Pipfile:

$ cat Pipfile
[[source]]
url = "https://pypi.python.org/simple"
verify_ssl = true
name = "pypi"

[packages]
pyspark = "==2.4.0"

[dev-packages]

[requires]
python_version = "3.7"

Additional info:

$ pipenv graph
pkg-resources==0.0.0
pyspark==2.4.0
  - py4j [required: ==0.10.7, installed: 0.10.7]

$ python --version
Python 3.7.5

$ java -version
openjdk version "1.8.0_242"
OpenJDK Runtime Environment (build 1.8.0_242-8u242-b08-0ubuntu3~19.10-b08)
OpenJDK 64-Bit Server VM (build 25.242-b08, mixed mode)

$ scala -version
Scala code runner version 2.11.12 -- Copyright 2002-2017, LAMP/EPFL
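To narrow down which Spark the `pyspark` command actually launches, a small diagnostic can be run from inside the pipenv shell. This is only a sketch: it reports where `pyspark` resolves on `PATH`, whether `SPARK_HOME` points at a system install (e.g. /usr/local/spark), and where `import pyspark` would load from; on this machine the launcher may be the system Spark rather than the pip-installed package.

```python
import importlib.util
import os
import shutil


def spark_diagnostics():
    """Collect hints about which Spark/PySpark a shell command would use."""
    info = {
        # The pyspark launcher found first on PATH (None if not on PATH).
        "pyspark_on_path": shutil.which("pyspark"),
        # If SPARK_HOME is set, its jars are what the driver actually runs.
        "spark_home": os.environ.get("SPARK_HOME"),
        # Where `import pyspark` would resolve from (None if not installed).
        "pyspark_module": None,
    }
    spec = importlib.util.find_spec("pyspark")
    if spec is not None:
        info["pyspark_module"] = spec.origin
    return info


if __name__ == "__main__":
    for key, value in spark_diagnostics().items():
        print(f"{key}: {value}")
```

If `pyspark_on_path` lives under /usr/local/spark while `pyspark_module` lives under the virtualenv's site-packages, the shell and the Python package come from two different Spark installs.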

Any idea what I am doing wrong?

Also:

pyspark --version
20/04/17 21:57:18 WARN Utils: Your hostname, andresg3-Lenovo-U430-Touch resolves to a loopback address: 127.0.1.1; using 192.168.50.138 instead (on interface wlp2s0)
20/04/17 21:57:18 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.0.0-preview
      /_/

Using Scala version 2.12.10, OpenJDK 64-Bit Server VM, 1.8.0_242
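This output suggests a version mismatch: the Pipfile pins pyspark==2.4.0 (a Scala 2.11-era build), while the `pyspark` launcher on PATH is the system Spark 3.0.0-preview built against Scala 2.12, and `NoClassDefFoundError: scala/xml/Text` is consistent with mixing Scala 2.11 and 2.12 artifacts. A minimal check of the two version strings from the outputs above (a sketch, not part of any Spark API):

```python
def major_version(version: str) -> int:
    """Return the leading major number of a string like '3.0.0-preview'."""
    return int(version.split(".", 1)[0])


def compatible(pip_pyspark: str, spark_dist: str) -> bool:
    """The pip pyspark package and the Spark distribution it drives
    must at least share a major version."""
    return major_version(pip_pyspark) == major_version(spark_dist)


# Versions taken from `pipenv graph` and `pyspark --version` above.
print(compatible("2.4.0", "3.0.0-preview"))  # → False
```

Aligning the two, either by pointing `SPARK_HOME` at a Spark 2.4.x distribution or by installing a PySpark package matching the system Spark, would remove the mismatch this check flags.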

Tags: python, apache-spark, pyspark
