Error when configuring the state backend to use hdfs

Problem Description

I am trying to set the state backend to hdfs:

val stateUri = "hdfs/path_to_dir"
val backend: RocksDBStateBackend = new RocksDBStateBackend(stateUri, true)
env.setStateBackend(backend)

I am running Flink 1.7.0 with the following dependencies (I tried all combinations):

"org.apache.flink"    %% "flink-connector-filesystem"          % flinkV
"org.apache.flink"    % "flink-hadoop-fs"                     % flinkV
"org.apache.hadoop"   % "hadoop-hdfs"                         % hadoopVersion
"org.apache.hadoop"   % "hadoop-common"                       % hadoopVersion

But when running the jar I get this error:

Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 'hdfs'. The scheme is not directly supported by Flink and no Hadoop file system to support this scheme could be loaded.
    at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:403)
    at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:318)
    at org.apache.flink.core.fs.Path.getFileSystem(Path.java:298)
    at org.apache.flink.runtime.state.filesystem.FsCheckpointStorage.<init>(FsCheckpointStorage.java:58)
    at org.apache.flink.runtime.state.filesystem.FsStateBackend.createCheckpointStorage(FsStateBackend.java:444)
    at org.apache.flink.contrib.streaming.state.RocksDBStateBackend.createCheckpointStorage(RocksDBStateBackend.java:407)
    at org.apache.flink.runtime.checkpoint.CheckpointCoordinator.<init>(CheckpointCoordinator.java:249)
    ... 17 more
Caused by: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Hadoop is not in the classpath/dependencies.
    at org.apache.flink.core.fs.UnsupportedSchemeFactory.create(UnsupportedSchemeFactory.java:64)
    at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:399)
    ... 23 more

Any help would be appreciated.

Tags: scala, hadoop, apache-flink

Solution


For accessing hdfs:// paths you do not have to bundle flink-hadoop-fs with your job, as long as flink-shaded-hadoop2-uber-1.8-SNAPSHOT.jar is in the lib folder of your Flink installation.
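Placing the shaded Hadoop uber-jar into the distribution's lib folder might look like the following sketch; FLINK_HOME and the location of the jar are assumptions you would adjust to your own setup:

```shell
# Copy the shaded Hadoop uber-jar into the Flink distribution's lib folder.
# FLINK_HOME is a placeholder for the root of your Flink installation.
cp flink-shaded-hadoop2-uber-1.8-SNAPSHOT.jar "$FLINK_HOME/lib/"
```

After restarting the cluster, Flink picks up the jar from lib and can resolve the hdfs scheme without any change to the job jar.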

If you do not have this dependency in your lib folder, then I would recommend using flink-fs-hadoop-shaded as a dependency instead, since it also relocates the Hadoop dependencies.
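In build.sbt, swapping the plain Hadoop artifacts for the shaded one could look like this sketch; flinkV is assumed to hold your Flink version (e.g. "1.7.0"), and the exact artifact coordinates should be verified against Maven Central:

```scala
// build.sbt sketch: replace hadoop-hdfs / hadoop-common with the shaded artifact.
// flinkV is a placeholder for your Flink version, e.g. "1.7.0".
libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-connector-filesystem" % flinkV,
  "org.apache.flink" %  "flink-fs-hadoop-shaded"     % flinkV
)
```

Because the shaded artifact relocates Hadoop's classes, it avoids classpath conflicts with other Hadoop versions that may already be on the cluster.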

Moreover, it is important that this dependency is also contained in the resulting job jar. Therefore, make sure you build an uber-jar with the sbt-assembly plugin.
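A minimal sbt-assembly setup for producing such an uber-jar could look like the sketch below; the plugin version and the merge strategy shown are common choices, not the only correct ones:

```scala
// project/plugins.sbt (plugin version is an assumption; pick a current release):
// addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.9")

// build.sbt: discard conflicting META-INF entries from the shaded jars so the
// assembly step does not fail on duplicate files.
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", _ @ _*) => MergeStrategy.discard
  case _                            => MergeStrategy.first
}
```

Running sbt assembly then produces a single fat jar under target/ that contains both your job classes and the shaded filesystem dependency.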

