log4j2.properties: setting a specific log level only for Spark

Problem description

I'm updating my work project from log4j to log4j2 and trying to get a grip on some of the syntax that changed between the two APIs.

I pulled a sample log4j2.properties from their website, shown below. Since Spark is extremely noisy at the INFO level, all I need is for the console appender to filter out anything below WARN coming from 'org.apache.spark'. With the old API this was as simple as log4j.logger.org.apache.spark=WARN, but it no longer seems that straightforward.

Any suggestions would be appreciated.

status = error
dest = err
name = PropertiesConfig

property.filename = target/rolling/rollingtest.log

filter.threshold.type = ThresholdFilter
filter.threshold.level = debug

appender.console.type = Console
appender.console.name = STDOUT
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
appender.console.filter.threshold.type = ThresholdFilter
appender.console.filter.threshold.level = info

appender.rolling.type = RollingFile
appender.rolling.name = RollingFile
appender.rolling.fileName = ${filename}
appender.rolling.filePattern = target/rolling2/test1-%d{MM-dd-yy-HH-mm-ss}-%i.log.gz
appender.rolling.layout.type = PatternLayout
appender.rolling.layout.pattern = %d %p %C{1.} [%t] %m%n
appender.rolling.policies.type = Policies
appender.rolling.policies.time.type = TimeBasedTriggeringPolicy
appender.rolling.policies.time.interval = 2
appender.rolling.policies.time.modulate = true
appender.rolling.policies.size.type = SizeBasedTriggeringPolicy
appender.rolling.policies.size.size = 10MB
appender.rolling.strategy.type = DefaultRolloverStrategy
appender.rolling.strategy.max = 5

logger.rolling.name = com.workplace.project
logger.rolling.level = info
logger.rolling.additivity = false
logger.rolling.appenderRef.rolling.ref = RollingFile

rootLogger.level = info
rootLogger.appenderRef.stdout.ref = STDOUT

Tags: scala, apache-spark, log4j, log4j2

Solution

Try adding the following lines to your configuration file. Note that the token after logger. (here spark) is just an identifier and should not contain dots, since dots act as separators in the properties format; the actual logger name goes in the .name property:

logger.spark.name = org.apache.spark
logger.spark.level = warn
logger.spark.additivity = false
logger.spark.appenderRef.rolling.ref = RollingFile
logger.spark.appenderRef.stdout.ref = STDOUT
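
To sanity-check that the configuration is actually picked up, a quick lookup of the effective levels can help. The snippet below is only a minimal sketch: it assumes the file above is on the classpath as log4j2.properties, and the class names used for the lookups (org.apache.spark.SparkContext, com.workplace.project.Main) are placeholders for whatever actually logs in your job.

import org.apache.logging.log4j.LogManager

object LogLevelCheck {
  def main(args: Array[String]): Unit = {
    // Logger names are hierarchical, so any logger under org.apache.spark
    // should inherit the WARN level from the "spark" logger defined above.
    val sparkLogger = LogManager.getLogger("org.apache.spark.SparkContext")
    println(s"spark effective level: ${sparkLogger.getLevel}")      // expect WARN
    println(s"spark info enabled:    ${sparkLogger.isInfoEnabled}") // expect false

    // The project logger still logs at INFO and goes to the RollingFile appender.
    val appLogger = LogManager.getLogger("com.workplace.project.Main")
    println(s"app effective level:   ${appLogger.getLevel}")        // expect INFO
  }
}

When running under spark-submit rather than a plain JVM, you may also need to point the driver and executors at the file explicitly, e.g. by adding -Dlog4j.configurationFile=<path> to spark.driver.extraJavaOptions and spark.executor.extraJavaOptions, if Spark does not already find it on the classpath.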
