Error running a PySpark stream with Kafka

Problem description

New to PySpark.

I get an error when the PySpark worker executes.

PySpark script:

import os
os.environ['PYSPARK_SUBMIT_ARGS'] = '--packages org.apache.spark:spark-streaming-kafka-0-8_2.11:2.0.2 pyspark-shell'

from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils
import json

sc = SparkContext(appName="PythonSparkStreamingKafka_RM_01")
sc.setLogLevel("WARN")

ssc = StreamingContext(sc, 60)

kafkaStream = KafkaUtils.createStream(ssc, 'localhost:2181', 'test', {'test':1})

parsed = kafkaStream.map(lambda v: v)
parsed.pprint()

ssc.start()
ssc.awaitTermination()

Versions:

kafka_2.10-0.8.2.1, PySpark version 2.3.0, Python version 3.6.4, macOS Sierra

Zookeeper command:

bin/zookeeper-server-start.sh config/zookeeper.properties

Kafka command:

bin/kafka-server-start.sh config/server.properties

Loaded some data into the topic using:

bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test < /tmp/test_log.log

Verified that the test topic really receives the data by running the following command (the data did show up in the test topic):

bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic test

Sample test_log.log data:

2018-05-01 13:29:54,287 30337 log_generator.py 145 log-generator DEBUG (over headset)
2018-05-01 13:29:54,951 30337 log_generator.py 139 log-generator ERROR metallic surface begin to become visible. A large dish antenna
2018-05-01 13:29:55,876 30337 log_generator.py 143 log-generator WARNING worked with him before. Here he comes.
2018-05-01 13:29:56,196 30337 log_generator.py 139 log-generator ERROR (to Artoo)

Now, when I run it via spark-submit or python pysparkStreaming.py, I get:

[Stage 0:>                                                          (0 + 1) / 1]2018-05-01 19:43:50 ERROR Executor:91 - Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.AbstractMethodError
:
:
:
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
2018-05-01 19:43:50 ERROR TaskSetManager:70 - Task 0 in stage 0.0 failed 1 times; aborting job
2018-05-01 19:43:50 ERROR ReceiverTracker:91 - Receiver has been stopped. Try to restart it.

Full log: https://justpaste.it/522fy

Tags: apache-spark, pyspark, apache-kafka

Solution


This looks like a version-incompatibility problem: the script pulls the spark-streaming-kafka-0-8 connector built for Spark 2.0.2, but runs on Spark 2.3.0, and that mismatch is a typical cause of java.lang.AbstractMethodError.
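If you want to stay on the DStream API from the question, one hedged fix is to make the connector's version suffix match the installed Spark version (2.3.0 here; the spark-streaming-kafka-0-8 artifact was still published for Spark 2.3.x). This is a sketch of only the changed line, not a confirmed fix from the original post:

```python
import os

# Hypothetical fix: pin the 0-8 Kafka connector to the same version as the
# installed Spark (2.3.0), so the driver and executor ABIs agree.
SPARK_VERSION = "2.3.0"
PACKAGE = "org.apache.spark:spark-streaming-kafka-0-8_2.11:" + SPARK_VERSION

# Must be set before SparkContext is created, as in the original script.
os.environ["PYSPARK_SUBMIT_ARGS"] = "--packages {} pyspark-shell".format(PACKAGE)
```

The rest of the script (createStream against Zookeeper on localhost:2181) can stay as-is with this connector.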

If you are using the latest Spark version (2.3), check:

https://spark.apache.org/docs/latest/structured-streaming-programming-guide.html#creating-streaming-dataframes-and-streaming-datasets

Kafka source - Reads data from Kafka. It’s compatible with Kafka broker versions 0.10.0 or higher. See the Kafka Integration Guide for more details.
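Following that recommendation, a minimal sketch of reading the same test topic with Structured Streaming might look like the script below. It assumes Spark 2.3+ and a broker on localhost:9092 (Structured Streaming talks to the Kafka broker directly, not to Zookeeper on 2181), and would be submitted with the spark-sql-kafka-0-10 package, e.g. spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.0. The helper and names are illustrative, not from the original post:

```python
# Sketch: Structured Streaming Kafka reader (assumes Spark 2.3+ and a
# Kafka broker reachable at localhost:9092; names here are illustrative).
BOOTSTRAP = "localhost:9092"  # broker address, not Zookeeper's 2181
TOPIC = "test"

def kafka_options(bootstrap, topic):
    """Options for the Structured Streaming Kafka source."""
    return {"kafka.bootstrap.servers": bootstrap, "subscribe": topic}

def main():
    # Import inside main() so the helper above stays usable without pyspark.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("PythonStructuredStreamingKafka")
             .getOrCreate())

    df = (spark.readStream
          .format("kafka")
          .options(**kafka_options(BOOTSTRAP, TOPIC))
          .load())

    # Kafka values arrive as bytes; cast to string before printing,
    # roughly the Structured Streaming analogue of pprint() in the question.
    query = (df.selectExpr("CAST(value AS STRING)")
             .writeStream
             .format("console")
             .start())
    query.awaitTermination()

# Call main() when submitting via spark-submit with the
# spark-sql-kafka-0-10 package on the classpath.
```

Note the 0-10 source requires Kafka brokers 0.10.0 or higher, so the 0.8.2.1 broker from the question would also need upgrading for this path.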

