AbstractMethodError when instantiating JavaStreamingContext

Problem Description

I get an AbstractMethodError when creating a JavaStreamingContext. My pom dependencies are below; I can't find a clue. Can anyone suggest what is going wrong here?

    <dependency> <!-- Spark dependency -->
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.3.1</version>
        <scope>provided</scope>
    </dependency>
    <dependency> <!-- Spark dependency -->
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.3.1</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>commons-io</groupId>
        <artifactId>commons-io</artifactId>
        <version>2.5</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>2.2.0</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-10 -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
        <version>2.0.0</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka -->
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka_2.10</artifactId>
        <version>0.8.2.2</version>
    </dependency>

    java.lang.AbstractMethodError
        at org.apache.spark.streaming.scheduler.StreamingListenerBus.<init>(StreamingListenerBus.scala:30)
        at org.apache.spark.streaming.scheduler.JobScheduler.<init>(JobScheduler.scala:57)
        at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:184)
        at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:76)
        at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:130)
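
The post does not show the driver code, but the `JavaStreamingContext.<init>` frame at the bottom of the trace corresponds to an ordinary construction like the sketch below (class name, app name, and master are placeholder assumptions, not from the original post):

    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class StreamingBootstrap {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("repro").setMaster("local[2]");
            // This constructor is the JavaStreamingContext.<init> frame in the trace;
            // the AbstractMethodError is thrown here, before any user code runs.
            JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(1));
            jssc.stop();
        }
    }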

Tags: apache-spark, spark-streaming

Solution


You are mixing several versions of Spark here: spark-core and spark-sql are 2.3.1, spark-streaming is 2.2.0, and the Kafka integration is 2.0.0. An AbstractMethodError is the typical symptom of exactly this kind of mix: the older spark-streaming jar was compiled against internal classes that changed in 2.3.1, so the incompatibility only surfaces at runtime, when the streaming context is constructed.

First of all, if you are using Apache Spark 2.3.1 and Kafka 0.10+, I would suggest the following:

    <dependency> <!-- Spark dependency -->
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.3.1</version>
        <scope>provided</scope>
    </dependency>
    <dependency> <!-- Spark dependency -->
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.3.1</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>commons-io</groupId>
        <artifactId>commons-io</artifactId>
        <version>2.5</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <!-- Keep the same Spark version as before -->
        <version>2.3.1</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-10 -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
        <!-- Keep the same Spark version as before -->
        <version>2.3.1</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka -->
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <!-- Use the Scala 2.11 build to match the other artifacts; mixing Scala binary versions causes similar runtime errors -->
        <artifactId>kafka_2.11</artifactId>
        <!-- Support for Kafka 0.8 is deprecated as of Spark 2.3.1, and the spark-streaming-kafka-0-10 dependency above targets Kafka 0.10+ -->
        <version><your_kafka_version_0.10+></version>
    </dependency>
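
Not part of the answer above, but a common Maven idiom that prevents this kind of version drift is to factor the Spark version and Scala binary version into properties, so a single edit updates every Spark artifact consistently (a sketch; the property names are conventional, not prescribed):

    <properties>
        <spark.version>2.3.1</spark.version>
        <scala.binary.version>2.11</scala.binary.version>
    </properties>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>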

It would also be good to know how you build and deploy your application. Depending on your runtime environment, you may need to mark more dependencies as provided scope to prevent conflicts between the packaged artifact and the jars already present in that environment.
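
One quick, standard way to verify what actually ends up on the classpath is Maven's dependency tree, filtered to Spark artifacts:

    mvn dependency:tree -Dincludes=org.apache.spark

If more than one Spark version (or more than one Scala suffix) shows up, the build is still mixed.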

Hope this helps.

