How to turn on TRACE logging in Spark

Problem Description

I've noticed that the RuleExecutor in Spark emits a trace log every time Catalyst transforms a plan:

https://github.com/apache/spark/blob/78801881c405de47f7e53eea3e0420dd69593dbd/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/rules/RuleExecutor.scala#L93

What I'd like to know is how to configure Spark so that trace logging is turned on. I'm using log4j and came across this documentation: https://jaceklaskowski.gitbooks.io/mastering-apache-spark/spark-logging.html

I've been digging through the code for a while, and I found that you can set "log4j.threshold=TRACE" to put some of the loggers into trace mode, but I can't seem to get the logger that Catalyst uses to pick up the setting.
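For context, a per-logger override in log4j 1.x (the logging setup Spark ships with) conventionally lives in conf/log4j.properties and looks like the sketch below. The appender lines mirror Spark's log4j.properties.template; only the last line is the Catalyst-specific addition, shown here as an illustration rather than a configuration verified for this question:

# Based on Spark's conf/log4j.properties.template (log4j 1.x)
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Raise only the Catalyst rules package (where RuleExecutor lives) to TRACE
log4j.logger.org.apache.spark.sql.catalyst.rules=TRACE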

What am I doing wrong?

Tags: apache-spark, logging, log4j

Solution


I just tried a simple Structured Streaming program that reads from Kafka, run from IntelliJ, and the following statement worked for me, i.e. it gave me trace-level logs:

SparkSession.builder().getOrCreate().sparkContext().setLogLevel("TRACE");

Here is part of the output showing some of the trace logs:

...
...
18/10/12 23:56:02 TRACE package$ExpressionCanonicalizer: Fixed point reached for batch CleanExpressions after 1 iterations.
18/10/12 23:56:02 TRACE package$ExpressionCanonicalizer: Batch CleanExpressions has no effect.
18/10/12 23:56:02 TRACE package$ExpressionCanonicalizer: Fixed point reached for batch CleanExpressions after 1 iterations.
18/10/12 23:56:02 TRACE package$ExpressionCanonicalizer: Batch CleanExpressions has no effect.
18/10/12 23:56:02 TRACE package$ExpressionCanonicalizer: Fixed point reached for batch CleanExpressions after 1 iterations.
18/10/12 23:56:02 TRACE package$ExpressionCanonicalizer: Batch CleanExpressions has no effect.
18/10/12 23:56:02 TRACE package$ExpressionCanonicalizer: Fixed point reached for batch CleanExpressions after 1 iterations.
18/10/12 23:56:02 TRACE package$ExpressionCanonicalizer: Batch CleanExpressions has no effect.
+-----+----+-----+-----------------------+
|topic|key |value|timestamp              |
+-----+----+-----+-----------------------+
|test |null|hi345|2018-10-12 23:56:00.099|
+-----+----+-----+-----------------------+
18/10/12 23:56:02 DEBUG GenerateUnsafeProjection: code for input[0, string, true],input[1, string, true],input[2, string, true],input[3, string, true]:
/* 001 */ public java.lang.Object generate(Object[] references) {
/* 002 */   return new SpecificUnsafeProjection(references);
/* 003 */ }
/* 004 */
/* 005 */ class SpecificUnsafeProjection extends org.apache.spark.sql.catalyst.expressions.UnsafeProjection {
/* 006 */
/* 007 */   private Object[] references;
...
...
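For completeness, here is a minimal, self-contained Scala sketch of the same programmatic approach. The object name, app name, and query are hypothetical stand-ins for the Kafka program above, and note that setLogLevel raises the driver's root logger to TRACE, so the output is very verbose:

import org.apache.spark.sql.SparkSession

object TraceLoggingDemo {  // hypothetical driver program
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("trace-logging-demo")  // hypothetical app name
      .master("local[*]")
      .getOrCreate()

    // Raise the driver's log level to TRACE; RuleExecutor logs plan changes at this level
    spark.sparkContext.setLogLevel("TRACE")

    // Any query that goes through Catalyst will now emit RuleExecutor trace messages
    import spark.implicits._
    Seq((1, "a"), (2, "b")).toDF("id", "name").filter($"id" > 1).show()

    spark.stop()
  }
}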

Hope this helps!
