fluent-plugin-concat configuration makes long logs disappear

Problem description

My `fluent-plugin-concat` configuration causes my long logs to disappear instead of being concatenated and sent to the Kinesis stream. I use fluentd to ship logs from containers deployed on AWS/ECS to a Kinesis stream (and from there to an ES cluster elsewhere). In rare cases, some logs are very large. Most of the time they stay below Docker's 16K limit, but those rare long logs are very important and we don't want to miss them.

My configuration file is attached below.

Just before the final match sections, I added:

<filter>
@type concat
key log
stream_identity_key container_id
partial_key partial_message
partial_value true
separator ""
</filter>

Another configuration I tried: with the options below, only the second part of the log was sent to ES; the first part only appeared in the fluentd logs. The logs from this configuration are attached as a file.

<filter>
  @type concat
  key log
  stream_identity_key partial_id
  use_partial_metadata true
  separator ""
</filter>

and:

<filter>
  @type concat
  key log
  use_partial_metadata true
  separator ""
</filter>

The log I'm testing with is also attached as a JSON document. If I remove this configuration, the log is sent in 2 chunks. What am I doing wrong? (edited)

The full configuration file:

<system>
  log_level info
</system>

# just listen on the unix socket in a dir mounted from host
# input is a json object, with the actual log line in the `log` field
<source>
  @type unix
  path /var/fluentd/fluentd.sock
</source>

# tag log line as json or text
<match service.*.*>
  @type rewrite_tag_filter
  <rule>
    key log
    pattern /.*"logType":\s*"application"/
    tag application.${tag}.json
  </rule>
  <rule>
    key log
    pattern /.*"logType":\s*"exception"/
    tag exception.${tag}.json
  </rule>
  <rule>
    key log
    pattern /.*"logType":\s*"audit"/
    tag audit.${tag}.json
  </rule>
  <rule>
    key log
    pattern /^\{".*\}$/
    tag default.${tag}.json
  </rule>
  <rule>
    key log
    pattern /.+/
    tag default.${tag}.txt
  </rule>
</match>

<filter *.service.*.*.*>
  @type record_transformer
  <record>
    service ${tag_parts[2]}
    childService ${tag_parts[3]}
  </record>
</filter>

<filter *.service.*.*.json>
  @type parser
  key_name log
  reserve_data true
  remove_key_name_field true
  <parse>
    @type json
  </parse>
</filter>

<filter *.service.*.*.*>
  @type record_transformer
  enable_ruby
  <record>
    @timestamp ${ require 'time'; Time.now.utc.iso8601(3) }
  </record>
</filter>


<filter>
  @type concat
  key log
  stream_identity_key container_id
  partial_key partial_message
  partial_value true
  separator ""
</filter>

<match exception.service.*.*.*>
  @type copy
  <store>
    @type kinesis_streams
    region "#{ENV['AWS_DEFAULT_REGION']}"
    stream_name the-name-ex
    debug false

    <instance_profile_credentials>
    </instance_profile_credentials>

    <buffer>
      flush_at_shutdown true
      flush_interval 10
      chunk_limit_size 16m
      flush_thread_interval 1.0
      flush_thread_burst_interval 1.0
      flush_thread_count 1
    </buffer>
  </store>

  <store>
    @type stdout
  </store>
 </match>

<match audit.service.*.*.json>
  @type copy

  <store>
    @type kinesis_streams
    region "#{ENV['AWS_DEFAULT_REGION']}"
    stream_name the-name-sa

    debug false

    <instance_profile_credentials>
    </instance_profile_credentials>

    <buffer>
      flush_at_shutdown true
      flush_interval 1
      chunk_limit_size 16m
      flush_thread_interval 0.1
      flush_thread_burst_interval 0.01
      flush_thread_count 15
    </buffer>
  </store>


  <store>
    @type stdout
  </store>

</match>

<match *.service.*.*.*>
  @type copy
  <store>
    @type kinesis_streams
    region "#{ENV['AWS_DEFAULT_REGION']}"
    stream_name  the-name-apl
    debug false

    <instance_profile_credentials>
    </instance_profile_credentials>

    <buffer>
      flush_at_shutdown true
      flush_interval 10
      chunk_limit_size 16m
      flush_thread_interval 1.0
      flush_thread_burst_interval 1.0
      flush_thread_count 1
    </buffer>
  </store>

  <store>
    @type stdout
  </store>
 </match>



 <match **>
  @type stdout
 </match>

Example log message - one long single line:

{"message": "some message", "longlogtest": "averylongjsonline", "service": "longlog-service", "logType": "application", "log": "aaa .... ( ~18000 chars )..longlogThisIsTheEndOfTheLongLog"}

The fluentd container log contains only the first part of the message, along with the following error message:

dump an error event: error_class=Fluent::Plugin::Parser::ParserError error="pattern not match with data

2021-03-05 13:45:41.886672929 +0000 fluent.warn: {"error":"#<Fluent::Plugin::Parser::ParserError: pattern not match with data '{\"message\": \"some message\", \"longlogtest\": \"averylongjsonline\", \"service\": \"longlog-service\", \"logType\": \"application\", \"log\": \"aaaasssssdddddjjjjjjjkkkkkkklllllllwewwwiiiiiilonglogaaaasssssdddddjjjjjjjkkkkkkklllllllwewwwiiiiiilonglogaaaasssssdddddjjjjjjjkkkkkkklllllllwewww
< .....Many lines of original log erased here...... >
djjjjjjjkkkkkkklllllllwewwwiiiaaaasssssdddddjjjjjjjkkkkkkklllllllwewwwiiiiiilongloglonglogaaaassss'>","location":null,"tag":"application.service.longlog.none.json","time":1614951941,"record":{"source":"stdout","log":"{\"message\": \"some message\", \"longlogtest\": \"averylongjsonline\", \"service\": \"longlog-service\", \"logType\": \"application\", \"log\": \"aaaasssssdddddjjjjjjjkkkkkkklllllllwewwwiiiiiilonglogaaaasssssdddddjjjjjjjkkkkkkklllllllwewwwiiiiiilonglogaaaasssssdddddjjjjjjjkkkkkkklllllllwewww
< .....Many lines of original log erased here...... >
wwiiiiiilonglogaaaasssssdddddjjjjjjjkkkkkkklllllllwewwwiiiaaaasssssdddddjjjjjjjkkkkkkklllllllwewwwiiiiiilongloglonglogaaaassss","partial_message":"true","partial_id":"5c752c1bbfda586f1b867a8ce2274e0ed0418e8e10d5e8602d9fefdb8ad2b7a1","partial_ordinal":"1","partial_last":"false","container_id":"803c0ebe4e6875ea072ce21179e4ac2d12e947b5649ce343ee243b5c28ad595a","container_name":"/ecs-longlog-18-longlog-b6b5ae85ededf4db1f00","service":"longlog","childService":"none"},"message"
:"dump an error event: error_class=Fluent::Plugin::Parser::ParserError error=\"pattern not match with data '{\\\"message\\\": \\\"some message\\\", \\\"longlogtest\\\": \\\"averylongjsonline\\\", \\\"service\\\": \\\"longlog-service\\\", \\\"logType\\\": \\\"application\\\", \\\"log\\\": \\\"aaaasssssdddddjjjjjjjkkkkkkklllllllwewwwiiiiiilonglogaaaasssssdddddjjjjjjjkkkkkkklllllllwewwwiiiiiilonglogaaaasssssdddddjjjjjjjkkkkkkklllllllwewwwiiiiiilonglogaaaasssssdddddjjjjjjjkkkkkkklllllllwewwwiiiiiilonglogaaaasss
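The `pattern not match` error is consistent with the `json` parser seeing an individual Docker fragment rather than the reassembled line: a JSON document cut off mid-string cannot be parsed on its own. A minimal illustration in Python (the sample record and truncation point are made up for demonstration):

```python
import json

# A complete JSON log line parses fine.
full = '{"logType": "application", "log": "aaa...ThisIsTheEnd"}'
record = json.loads(full)
assert record["logType"] == "application"

# Docker's log driver splits lines over 16K into fragments; the first
# fragment ends mid-string and is not a valid JSON document on its own.
fragment = full[:30]  # simulate the first partial_message chunk
try:
    json.loads(fragment)
    parsed = True
except json.JSONDecodeError:
    parsed = False

print(parsed)  # False: the fragment is rejected, like "pattern not match"
```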

Tags: docker, logging, amazon-ecs, fluentd, fluent-docker

Solution
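A likely cause, inferred from the error log above and not confirmed in the original post: the record in the dumped error still carries `partial_message`, `partial_id`, and `partial_ordinal` fields and the tag `application.service.longlog.none.json`, which means `rewrite_tag_filter` and the `json` parser already ran on an individual fragment before the `concat` filter could reassemble it. A sketch of a reordering that concatenates first, placed immediately after the `<source>` block and before any `<match>` or parser (the `service.*.*` tag pattern and `use_partial_metadata` option are taken from the configurations above):

```
<source>
  @type unix
  path /var/fluentd/fluentd.sock
</source>

# Reassemble Docker's 16K partial fragments BEFORE tagging and parsing,
# so downstream filters only ever see complete log lines.
<filter service.*.*>
  @type concat
  key log
  use_partial_metadata true
  separator ""
</filter>
```

With this ordering, the later `rewrite_tag_filter` rules and the JSON parser operate on whole lines, so long logs should be routed and parsed the same way as short ones.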

