Type mismatch when calling a generic Java method from Scala

Problem description

I have n Java classes that share a common superclass (the data model). The list of classes is an input parameter to a Scala method, process, in which I want to build resultStreams by calling a generic Java method. How can I solve this? I tried [_ <: SpecificRecordBase] and [SpecificRecordBase] at the method call site, but got the same result.

Error

Error:(146, 88) type mismatch;
 found   : Class[_$3] where type _$3 <: org.apache.avro.specific.SpecificRecordBase
 required: Class[org.apache.avro.specific.SpecificRecordBase]
Note: _$3 <: org.apache.avro.specific.SpecificRecordBase, but Java-defined class Class is invariant in type T.
You may wish to investigate a wildcard type such as `_ <: org.apache.avro.specific.SpecificRecordBase`. (SLS 3.2.10)
                AvroHelper.deSerializeAvroObject(record.value, cl))(TypeInformation.of(cl)))

Scala code

object GenerickRunnerStackOverFlow {
  def process(inputClasses: List[Class[_ <: SpecificRecordBase]]): Unit = {
    val newStream: DataStream[KafkaSourceType] = env.addSource(....).uid(...).filter(...)

    val resultStreams = inputClasses.map(cl =>
      newStream.map(record =>
        AvroHelper.deSerializeAvroObject(record.value, cl))(TypeInformation.of(cl)))

    ...
  }

  def main(args: Array[String]): Unit = {
    val topicToClasses: List[Class[_ <: SpecificRecordBase]] =
      List(Types.RECORD_1.getClassType, Types.RECORD_2.getClassType)
    process(topicToClasses)
  }
}

Java method signature

public static <A extends SpecificRecord> A deSerializeAvroObject(byte[] object, Class<A> clazz) { ... }

Model

public class Record1 extends SpecificRecordBase {}
public class Record2 extends SpecificRecordBase {}
...
public enum Types {
  RECORD_1(Record1.class),
  RECORD_2(Record2.class);
  ....

  private Class<? extends SpecificRecordBase> clazz;
  public Class<? extends SpecificRecordBase> getClassType() { return this.clazz; }
}

I get the same error message with Scala's addSink method:

def addSink(sinkFunction : org.apache.flink.streaming.api.functions.sink.SinkFunction[T]) : org.apache.flink.streaming.api.datastream.DataStreamSink[T] = { /* compiled code */ }

So I wrote a wrapper method:

def addSinkWithSpecificRecordBase[A <: SpecificRecordBase](
    stream: DataStream[A],
    sink: BucketingSink[A]): DataStreamSink[A] = stream.addSink(sink)

Invocation:

val result = topicToSinkStream.foreach { el =>
  val stream: DataStream[_ <: SpecificRecordBase] = el._2._1
  val sink: BucketingSink[_ <: SpecificRecordBase] = el._2._2
  addSinkWithSpecificRecordBase(stream, sink)
}

which produces the error:

Error:(209, 37) type mismatch;
 found   : org.apache.flink.streaming.api.scala.DataStream[_$9] where type _$9 <: org.apache.avro.specific.SpecificRecordBase
 required: org.apache.flink.streaming.api.scala.DataStream[org.apache.avro.specific.SpecificRecordBase]
Note: _$9 <: org.apache.avro.specific.SpecificRecordBase, but class DataStream is invariant in type T.
You may wish to define T as +T instead. (SLS 4.5)
      addSinkWithSpecificRecordBase(stream, sink)

where topicToSinkStream is:

Map[String, (DataStream[_ <: SpecificRecordBase], BucketingSink[_ <: SpecificRecordBase])]

I also tried removing SpecificRecordBase from the generic bound in the method declaration, and adding + and - variance annotations to the parameter types, but without success.

Tags: java, scala, generics, scala-java-interop

Solution


The problem is that the type of AvroHelper.deSerializeAvroObject(record.value, cl) is SpecificRecordBase (the wildcard _ <: SpecificRecordBase is only allowed inside type parameters, not here), so the map's result type and TypeInformation.of(cl) no longer agree. The solution is to extract a helper method whose type parameter captures the wildcard:

def processClass[A <: SpecificRecordBase](cl: Class[A], newStream: DataStream[KafkaSourceType]) =
  newStream.map(record => AvroHelper.deSerializeAvroObject(record.value, cl))(TypeInformation.of(cl))

(If you define it locally inside process, you can use newStream directly instead of passing it as a parameter.) Then:

val resultStreams = inputClasses.map(cl => processClass(cl, newStream))
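To see why this works without pulling in Flink, here is a minimal, self-contained sketch of the same pattern. All names (Base, Rec1, Container, mapWith, processClass) are hypothetical stand-ins: Container plays the role of the invariant DataStream/Class, and mapWith mimics a method that needs the same T in two argument lists, like map(f)(TypeInformation.of(cl)).

```scala
// Hypothetical stand-ins for the question's model classes.
abstract class Base
class Rec1 extends Base { override def toString = "Rec1" }
class Rec2 extends Base { override def toString = "Rec2" }

// Invariant in T, like java.lang.Class[T] or Flink's DataStream[T].
case class Container[T](value: T)

object CaptureDemo {
  // Mimics DataStream.map(f)(TypeInformation.of(cl)): the SAME T is needed
  // in both argument lists, which is what breaks under a bare wildcard.
  def mapWith[T](f: Unit => T)(cl: Class[T]): Container[T] = Container(f(()))

  // The method's type parameter A captures the wildcard once, so both uses
  // of `cl` below refer to the same (unknown but fixed) type.
  def processClass[A <: Base](cl: Class[A], raw: Any): Container[A] =
    mapWith((_: Unit) => cl.cast(raw))(cl)

  def run(): List[Container[_ <: Base]] = {
    val inputs: List[(Class[_ <: Base], Base)] =
      List((classOf[Rec1], new Rec1), (classOf[Rec2], new Rec2))
    // Capture happens per element: A is fixed separately for each cl.
    inputs.map { case (cl, v) => processClass(cl, v) }
  }
}
```

Calling CaptureDemo.run().map(_.value.toString) should yield List("Rec1", "Rec2"): each element keeps its own captured type, while inlining cl.cast(raw) at the call site with cl: Class[_ <: Base] reproduces the question's invariance error.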

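The addSink error is slightly different: in a tuple typed (DataStream[_ <: SpecificRecordBase], BucketingSink[_ <: SpecificRecordBase]), the two wildcards are independent existentials, so the compiler cannot prove the stream and the sink agree on the element type, and a capturing helper alone does not help. One workaround is to store the pair in a wrapper class with a single shared type parameter, fixed when the pair is built. A Flink-free sketch, with hypothetical names (Model, RecA, Pipeline):

```scala
// Hypothetical model hierarchy.
abstract class Model
class RecA extends Model

// A source/sink pair whose shared element type A is fixed at construction,
// unlike a tuple (List[_ <: Model], (_ <: Model) => String) whose two
// wildcards are unrelated as far as the compiler knows.
final case class Pipeline[A <: Model](source: List[A], sink: A => String) {
  // Inside the class, `source` and `sink` provably agree on A.
  def connect(): List[String] = source.map(sink)
}

object PipelineDemo {
  def run(): List[String] = {
    val pipelines: List[Pipeline[_ <: Model]] =
      List(Pipeline[RecA](List(new RecA, new RecA), _ => "recA"))
    // Each pipeline's existential is captured as a whole, so connect()
    // typechecks even though A is unknown at this point.
    pipelines.flatMap(p => p.connect())
  }
}
```

The key design choice is to pair the stream and sink before their element type is forgotten; the same idea applies to the Map in the question by storing a wrapper value instead of a tuple.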