java.lang.StackOverflowError when executing Spark Streaming

Problem description

I am using Spark Streaming to parse some Kafka messages in real time. Before parsing the messages, I read some files from the local disk and construct two variables that are useful for the parsing: GridMatrix GM and LinkMatcher LM. Here is the code that gives me java.lang.StackOverflowError when I submit it with spark-submit xxx.jar:

package Streaming;   // matches the Class.forName("Streaming.Stream") call below

import java.io.Serializable;
import java.net.InetAddress;
import java.util.Arrays;
import java.util.Collection;
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.PairFunction;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaInputDStream;
import org.apache.spark.streaming.api.java.JavaPairDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka010.ConsumerStrategies;
import org.apache.spark.streaming.kafka010.KafkaUtils;
import org.apache.spark.streaming.kafka010.LocationStrategies;
import scala.Tuple2;

public class Stream implements Serializable {
    // Non-static fields, built on the driver before the streaming job starts.
    GridMatrix GM = GridMatrixConstructor.init_Grid_Matrix(0.001);
    LinkMatcher LM = new LinkMatcher();

    public void parse_rdd_record(String[] fields) {
        try {
            System.out.println(InetAddress.getLocalHost().getHostName() + "---->" + Thread.currentThread());
        }
        catch (Exception e) {
            e.printStackTrace();
        }
        System.out.println(LM.GF.toString());
        System.out.println(GM.topleft_x);
    }

    public void Streaming_process() throws Exception {
        SparkConf conf = new SparkConf()
                .setAppName("SparkStreaming")
                .setMaster("local[*]");
        conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
        conf.registerKryoClasses(new Class<?>[]{
                Class.forName("Streaming.Stream")
        });


        JavaSparkContext sc = new JavaSparkContext(conf);
        sc.setLogLevel("WARN");
        JavaStreamingContext ssc = new JavaStreamingContext(sc, new Duration(2000));
        Map<String, Object> kafkaParams = new HashMap<>();
        kafkaParams.put("bootstrap.servers", "xxx.xx.xx.xx:20103,xxx.xx.xx.xx:20104,xxx.xx.xx.xx:20105");
        kafkaParams.put("key.deserializer", StringDeserializer.class);
        kafkaParams.put("value.deserializer", StringDeserializer.class);
        kafkaParams.put("group.id", "use_a_separate_group_id_for_each_stream");
        kafkaParams.put("auto.offset.reset", "latest");
        kafkaParams.put("enable.auto.commit", false);

        Collection<String> topics = Arrays.asList("nc_topic_gis_test");
        JavaInputDStream<ConsumerRecord<String, String>> GPS_DStream =
                KafkaUtils.createDirectStream(
                        ssc,
                        LocationStrategies.PreferConsistent(),
                        ConsumerStrategies.<String, String>Subscribe(topics, kafkaParams)
                );

        JavaPairDStream<String, String> GPS_DStream_Pair =  GPS_DStream.mapToPair(
                (PairFunction<ConsumerRecord<String, String>, String, String>) record ->
                        new Tuple2<>("GPSValue", record.value()));

        // Referencing this.parse_rdd_record inside the lambda captures the
        // enclosing Stream instance, which must then be serialized to the executors.
        GPS_DStream_Pair.foreachRDD(PairRDD -> PairRDD.foreach(rdd -> {
            String[] fields = rdd._2.split(",");
            this.parse_rdd_record(fields);
        }));

        ssc.start();
        ssc.awaitTermination();
    }

    public static void main(String[] args) throws Exception {
        new Stream().Streaming_process();
    }
}

It gives me the following error:

Exception in thread "streaming-job-executor-0" java.lang.StackOverflowError
        at java.io.Bits.putDouble(Bits.java:121)
        at java.io.ObjectStreamClass$FieldReflector.getPrimFieldValues(ObjectStreamClass.java:2168)
        at java.io.ObjectStreamClass.getPrimFieldValues(ObjectStreamClass.java:1389)
        at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1533)
        at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
        at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
        at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
        at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
        at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
        at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
        at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
        at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
        at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
        at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
        at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
        at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
        at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
        at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
        at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
        at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
        at java.util.HashMap.internalWriteEntries(HashMap.java:1790)
        at java.util.HashMap.writeObject(HashMap.java:1363)
        at sun.reflect.GeneratedMethodAccessor37.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:1140)
        at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1496)
        at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
        at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
        at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
        at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
        at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
        at java.util.HashMap.internalWriteEntries(HashMap.java:1790)
        at java.util.HashMap.writeObject(HashMap.java:1363)
        at sun.reflect.GeneratedMethodAccessor37.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

However, if I change GM and LM to static variables, it runs fine. That is, changing the two field declarations to:

private static final GridMatrix GM = GridMatrixConstructor.init_Grid_Matrix(0.001);
private static final LinkMatcher LM = new LinkMatcher();

Can anyone tell me why it does not work with non-static variables?

Tags: java, apache-spark, static, spark-streaming

Solution

The difference between the static and non-static versions is that in the non-static case the fields are sent to all workers as part of the Stream closure, whereas in the static case they are not sent by default, unless they are referenced by one of the streaming lambdas, which is not the case here.
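To make the capture concrete: parse_rdd_record is an instance method, so calling it inside the foreach lambda implicitly captures this, and Spark must then ship the whole Stream object, GM and LM included, to the executors. Below is a minimal sketch of the contrast, using a hypothetical static helper parse_record_static that is not in the original code:

// Hypothetical static helper (not in the original code); with the
// static-field fix in place it reads the static GM and LM directly.
static void parse_record_static(String[] fields) {
    System.out.println(LM.GF.toString());
    System.out.println(GM.topleft_x);
}

// Capturing version: calling an instance method pulls the whole Stream
// object (GM and LM included) into the serialized closure.
GPS_DStream_Pair.foreachRDD(pairRDD -> pairRDD.foreach(record ->
        this.parse_rdd_record(record._2.split(","))));

// Non-capturing version: the static call keeps 'this' out of the closure;
// each executor JVM builds its own GM and LM when the class is loaded there.
GPS_DStream_Pair.foreachRDD(pairRDD -> pairRDD.foreach(record ->
        Stream.parse_record_static(record._2.split(","))));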

When shipping the Stream object to the workers, Spark tries to serialize it and, judging from the stack trace provided, this fails. The most likely cause is that these structures contain cyclic or deeply nested references internally (note the repeating HashMap.writeObject frames) that Java serialization cannot handle. Note also that the Kryo configuration in the code does not help here: Spark always serializes closures with Java serialization regardless of spark.serializer, which is why the trace shows java.io.ObjectOutputStream.
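If the fields need to stay non-static (for example, to keep per-instance configuration), a common workaround is to exclude them from serialization and rebuild them on each worker. The following is only a sketch, not part of the original answer; it assumes GridMatrixConstructor.init_Grid_Matrix and the LinkMatcher constructor can run on the executors, i.e. that any local files they read are also available there:

public class Stream implements Serializable {
    // 'transient' keeps the fields out of Java serialization entirely;
    // each executor rebuilds them locally on first use.
    private transient GridMatrix GM;
    private transient LinkMatcher LM;

    private GridMatrix gm() {
        if (GM == null) {
            GM = GridMatrixConstructor.init_Grid_Matrix(0.001);
        }
        return GM;
    }

    private LinkMatcher lm() {
        if (LM == null) {
            LM = new LinkMatcher();
        }
        return LM;
    }

    public void parse_rdd_record(String[] fields) {
        System.out.println(lm().GF.toString());
        System.out.println(gm().topleft_x);
    }

    // ... Streaming_process() and main() unchanged ...
}

Executors run tasks on multiple threads, so in practice this lazy initialization would need synchronization; the static final version from the question avoids that problem because JVM class initialization is already thread-safe.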

