No translator error when running an Apache Beam job on a Flink cluster

Problem description

For testing, I created a very simple Apache Beam job, written in Scala, that looks like this:

import org.apache.beam.sdk.Pipeline
import org.apache.beam.sdk.options.PipelineOptionsFactory
import org.apache.beam.sdk.transforms.DoFn.ProcessElement
import org.apache.beam.sdk.transforms.{Create, DoFn, ParDo}
import org.slf4j.LoggerFactory

object Test {
  private val logger = LoggerFactory.getLogger(getClass)

  def main(args: Array[String]): Unit = {
    val options = PipelineOptionsFactory.fromArgs(args: _*).create()
    val p = Pipeline.create(options)

    println(s"--------> $options")

    val printDoFn = new DoFn[String, Void] {
      @ProcessElement
      def processElement(c: ProcessContext): Unit = {
        val e = c.element()
        logger.info(e)
        println(s"===> $e")
      }
    }

    p.apply(Create.of[String]("A", "B", "CCC"))
      .apply(ParDo.of(printDoFn))

    p.run()
  }
}

I then deployed a Flink cluster using the official Flink Docker image.

I built an uber-jar of my test program with the maven-shade-plugin.

I uploaded the uber-jar through the Job Manager's web UI.

Finally, I logged in to the JobManager machine, located the uploaded uber-jar, and ran the job:

flink run -c myapps.Test \
 ./52649b36-aa57-4f2b-95c7-2552fd737ea6_pipeline_beam-1.0.0-SNAPSHOT.jar \
 --runner=FlinkRunner

But I got this error:

org.apache.flink.client.program.ProgramInvocationException: The main method caused an error.
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:545)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:420)
    at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:404)
    at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:798)
    at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:289)
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:215)
    at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1035)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1111)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
    at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1111)
Caused by: java.lang.IllegalStateException: No translator known for org.apache.beam.sdk.io.Read$Bounded
    at org.apache.beam.runners.core.construction.PTransformTranslation.urnForTransform(PTransformTranslation.java:164)
    at org.apache.beam.runners.flink.FlinkBatchPipelineTranslator.visitPrimitiveTransform(FlinkBatchPipelineTranslator.java:93)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:649)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:649)
    at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600(TransformHierarchy.java:311)
    at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:245)
    at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:458)
    at org.apache.beam.runners.flink.FlinkPipelineTranslator.translate(FlinkPipelineTranslator.java:38)
    at org.apache.beam.runners.flink.FlinkBatchPipelineTranslator.translate(FlinkBatchPipelineTranslator.java:49)
    at org.apache.beam.runners.flink.FlinkPipelineExecutionEnvironment.translate(FlinkPipelineExecutionEnvironment.java:119)
    at org.apache.beam.runners.flink.FlinkRunner.run(FlinkRunner.java:110)
    at org.apache.beam.sdk.Pipeline.run(Pipeline.java:313)
    at org.apache.beam.sdk.Pipeline.run(Pipeline.java:299)
    ...

I think the key error is: No translator known for org.apache.beam.sdk.io.Read$Bounded

I compiled my program against Apache Beam 2.7.0. Following the Flink runner page (https://beam.apache.org/documentation/runners/flink/), I deployed Flink 1.5.5 using the official image flink:1.5.5-hadoop28-scala_2.11-alpine.

I couldn't find anything useful about this on Google.

Tags: apache-flink, apache-beam, apache-beam-io

Solution


I found the problem. I had written the Maven POM file by hand, including the shade plugin configuration, and I had missed this part of the shade plugin setup:

<transformers>
  <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
</transformers>
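For reference, a minimal maven-shade-plugin configuration with the transformer in place might look like the sketch below (the plugin version and execution layout are illustrative, not taken from the original POM):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- Merges META-INF/services files from all dependencies
               instead of letting one overwrite another -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
```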

With the ServicesResourceTransformer merging the META-INF/services entries from all dependencies, the FlinkRunner can find its transform translators again, and the job now works.
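To see why the transformer matters: Beam's runners discover their transform translators through java.util.ServiceLoader, which reads plain-text registration files under META-INF/services/ on the classpath. When an uber-jar is built without merging those files, one dependency's file silently overwrites another's, and the lookup comes back empty at pipeline-translation time. The sketch below demonstrates the lookup mechanism itself using a standard JDK service interface (java.sql.Driver), not Beam's actual registrar classes:

```scala
import java.util.ServiceLoader

object ServiceCheck {
  // Count the providers registered for an interface via the
  // META-INF/services mechanism that ServiceLoader reads.
  def providerCount(iface: Class[_]): Int = {
    val it = ServiceLoader.load(iface).iterator()
    var n = 0
    while (it.hasNext) { it.next(); n += 1 }
    n
  }

  def main(args: Array[String]): Unit = {
    // java.sql.Driver implementations register through the same
    // mechanism; a shaded jar that dropped the service files would
    // make a count like this fall to zero, which is effectively
    // what the "No translator known for ..." error reports.
    println(s"registered drivers: ${providerCount(classOf[java.sql.Driver])}")
  }
}
```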

