scala - Spark-Scala-Intellij java.lang.NoSuchMethodError
Problem description
I am using Apache Spark with Scala in IntelliJ. I have no experience with Maven. I am trying to put together a simple WordCount program using JDK 11, Scala 2.12.12, and Spark 3.0.1. The project compiles fine, but at runtime this is the error I get:
Exception in thread "main" java.lang.NoSuchMethodError: 'void scala.util.matching.Regex.<init>(java.lang.String, scala.collection.Seq)'
at scala.collection.immutable.StringLike.r(StringLike.scala:284)
at scala.collection.immutable.StringLike.r$(StringLike.scala:284)
at scala.collection.immutable.StringOps.r(StringOps.scala:33)
at scala.collection.immutable.StringLike.r(StringLike.scala:273)
at scala.collection.immutable.StringLike.r$(StringLike.scala:273)
at scala.collection.immutable.StringOps.r(StringOps.scala:33)
at org.apache.spark.util.Utils$.<init>(Utils.scala:104)
at org.apache.spark.util.Utils$.<clinit>(Utils.scala)
at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:75)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:70)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:59)
at WordCount$.main(WordCount.scala:9)
at WordCount.main(WordCount.scala)
I have already checked the most common causes of this error: spark-core is not marked as provided (its scope is compile), and the Spark and Scala versions are up to date.
Here is the code:
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Spark Scala WordCount Example").setMaster("local[1]")
    val sc = new SparkContext(conf)
    // Split each line on commas and pair every word with a count of 1
    val map = sc.textFile("/Users/<username>/Downloads/TestFile.csv").flatMap(line => line.split(",")).map(word => (word, 1))
    val counts = map.reduceByKey(_ + _)
    counts.collect().foreach(println)
    sc.stop()
  }
}
Here is the pom.xml I used:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.org.example</groupId>
    <artifactId>Project</artifactId>
    <version>1.0-SNAPSHOT</version>
    <inceptionYear>2008</inceptionYear>
    <packaging>jar</packaging>
    <properties>
        <scala.version>2.12.12</scala.version>
        <spark.version>3.0.1</spark.version>
    </properties>
    <repositories>
        <repository>
            <id>scala-tools.org</id>
            <name>Scala-Tools Maven2 Repository</name>
            <url>http://scala-tools.org/repo-releases</url>
        </repository>
    </repositories>
    <pluginRepositories>
        <pluginRepository>
            <id>scala-tools.org</id>
            <name>Scala-Tools Maven2 Repository</name>
            <url>http://scala-tools.org/repo-releases</url>
        </pluginRepository>
    </pluginRepositories>
    <dependencies>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <dependency>
            <groupId>org.specs</groupId>
            <artifactId>specs</artifactId>
            <version>1.2.5</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
            <version>${spark.version}</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.12</artifactId>
            <version>${spark.version}</version>
            <scope>compile</scope>
        </dependency>
    </dependencies>
    <build>
        <sourceDirectory>src/main/scala</sourceDirectory>
        <resources>
            <resource>
                <directory>src/main/resources</directory>
            </resource>
        </resources>
        <plugins>
        </plugins>
    </build>
    <reporting>
        <plugins>
            <plugin>
                <groupId>org.scala-tools</groupId>
                <artifactId>maven-scala-plugin</artifactId>
                <configuration>
                    <scalaVersion>${scala.version}</scalaVersion>
                </configuration>
            </plugin>
        </plugins>
    </reporting>
</project>
I previously tried running the same program with sbt and it worked perfectly fine, but with Maven it always returns this error.
Solution
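This NoSuchMethodError is the classic symptom of a Scala binary-version mismatch on the runtime classpath. In Scala 2.12 the Regex constructor's varargs compile to scala.collection.Seq, while in 2.13 they compile to scala.collection.immutable.Seq; the stack trace shows a 2.12 class (StringLike, which no longer exists in 2.13) calling the 2.12 constructor signature and not finding it, which means a 2.13 scala-library jar is shadowing the 2.12 one at runtime. Since the pom pins scala-library 2.12.12 and the _2.12 Spark artifacts, the stray 2.13 library most likely comes from the IntelliJ run configuration (check the Scala SDK version under File → Project Structure) or a transitive dependency (`mvn dependency:tree -Dincludes=org.scala-lang` shows what Maven itself resolves). A minimal diagnostic sketch, using only standard library and JDK calls, to confirm which scala-library the JVM actually loads:

```scala
object ScalaClasspathCheck {
  def main(args: Array[String]): Unit = {
    // Version of the scala-library the JVM actually loaded;
    // for the pom above this should report 2.12.12, not 2.13.x.
    println(scala.util.Properties.versionNumberString)

    // Jar the Regex class was loaded from. If this points at a 2.13 jar
    // while 2.12 classes are also on the classpath, you get exactly the
    // NoSuchMethodError in the question.
    val src = Option(classOf[scala.util.matching.Regex]
      .getProtectionDomain.getCodeSource)
    println(src.map(_.getLocation).getOrElse("(bootstrap classpath)"))
  }
}
```

Running this with the same run configuration as WordCount (rather than from sbt, which worked because it builds its own classpath) should reveal the mismatched jar; aligning the IDE's Scala SDK with the 2.12.12 declared in the pom is then the usual fix.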