java - Spark application logger
Problem Description
I'm trying to get my Spark application to log to its own log file. I don't want my output mixed in with Spark's, because that makes it unreadable.
I gave up on Logback because of compatibility issues with the Spark libraries, so I switched to log4j. I created a custom log4j.properties in src/main/resources of my Java application, but when I ran spark-submit on my jar, all my logs ended up in the Spark worker's log file. It seems the custom log4j.properties inside my jar is being ignored.
This is the command:
./spark-submit --jars /home/user/LIBRERIE/ORACLE/ojdbc8.jar,\
/home/user/.m3/repository/org/mongodb/spark/mongo-spark-connector_2.11/2.3.0/mongo-spark-connector_2.11-2.3.0.jar,\
/home/user/.m3/repository/org/mongodb/mongo-java-driver/3.8.1/mongo-java-driver-3.8.1.jar,\
/home/user/.m3/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar \
--class my.pkg.common.SparkHandlerStandalone \
--master spark://162.16.215.59:7077 \
--deploy-mode cluster \
/home/user/NetBeansProjects/SparkScala/target/SparkScala-1.0-SNAPSHOT.jar
My log4j.properties:
log4j.rootLogger=DEBUG, file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/home/user/TEMP/Spark/sparkapp.log
log4j.appender.file.MaxFileSize=5MB
log4j.appender.file.MaxBackupIndex=10
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss.SSS} %-5p %c{1}:%L - %m%n
Does anyone know how I can separate the two logs?
Solution
Create a separate category for your custom logs in the log4j properties file:
log4j.appender.customLog=org.apache.log4j.FileAppender
log4j.appender.customLog.File=/home/user/TEMP/Spark/sparkapp.log
log4j.appender.customLog.layout=org.apache.log4j.PatternLayout
log4j.appender.customLog.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss.SSS} %-5p %c{1}:%L - %m%n
log4j.category.customLog=INFO, customLog
log4j.additivity.customLog=false
In your application code, obtain the logger like this:
static final Logger customLog = Logger.getLogger("customLog");
customLog.info("Test msg");
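The `log4j.additivity.customLog=false` setting is what keeps these messages out of Spark's own appenders. The same idea can be sketched with the JDK's built-in java.util.logging, used here only because it needs no extra jars (the class and file name are made up for illustration): a named logger gets its own file handler, and forwarding to the parent/root logger is disabled.

```java
import java.util.logging.FileHandler;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;

public class CustomLogDemo {
    public static void main(String[] args) throws Exception {
        // Named logger, analogous to Logger.getLogger("customLog") with log4j
        Logger customLog = Logger.getLogger("customLog");

        // Dedicated file handler, analogous to log4j.appender.customLog.File=...
        FileHandler handler = new FileHandler("sparkapp-demo.log");
        handler.setFormatter(new SimpleFormatter());
        customLog.addHandler(handler);

        // Analogous to log4j.additivity.customLog=false: do not forward
        // records to the root logger, so they stay out of the shared log
        customLog.setUseParentHandlers(false);

        customLog.info("Test msg");
        handler.close();
    }
}
```

With additivity (or parent handlers) left on, every record would be delivered twice: once to the custom file and once to whatever the root logger writes to, which is exactly the mixing the question describes.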
Make sure the custom log4j properties file is set via the extraJavaOptions:
spark.executor.extraJavaOptions=-Dlog4j.configuration=/home/hadoop/spark-conf/log4j.properties
spark.driver.extraJavaOptions=-Dlog4j.configuration=/home/hadoop/spark-conf/log4j.properties
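In cluster deploy mode those paths must exist on every node. An alternative (a sketch, with an assumed local path for the properties file) is to ship the file with the job via `--files` and reference it by bare name, since Spark places `--files` uploads in each container's working directory:

```shell
# Ship the custom log4j.properties with the job and point both the
# driver and the executors at it; the "file:" prefix makes log4j read
# it from the container's working directory.
./spark-submit \
  --files /home/user/spark-conf/log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
  --class my.pkg.common.SparkHandlerStandalone \
  --master spark://162.16.215.59:7077 \
  --deploy-mode cluster \
  /home/user/NetBeansProjects/SparkScala/target/SparkScala-1.0-SNAPSHOT.jar
```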