Parse my json file with grok patterns in logstash?

Problem description

I am trying to parse a json file into elasticsearch with logstash, but I can't get it to work. I think I need to write some grok patterns, but I haven't managed that either. How can I send the following json to elasticsearch with logstash?

{"机器名":"test1",

"longdate":"2019-01-29 13:19:32",

“级别”:“错误”,

“我的网站”:“test1”,

“消息”:“测试2”,

“异常”:“test3”,

“时间戳”:“2019-01-29T13:19:32.257Z”}

My logstash file:


input {
  file {
    path => ["P:/logs/*.txt"]
    start_position => "beginning"
    discover_interval => 10
    stat_interval => 10
    sincedb_write_interval => 10
    close_older => 10
    codec => multiline {
      # the required "pattern" setting is missing here -- this is what the
      # second error below complains about
      negate => true
      what => "previous"
    }
  }
}

filter {
  date {
    match => ["TimeStamp", "ISO8601"]
  }
  json {
    source => "request"   # the file input actually puts each raw line in "message", not "request"
    target => "parsedJson"
  }
}

output {
  stdout {
    codec => rubydebug
  }

  elasticsearch {
    hosts => [ "http://localhost:9200" ]
    index => "log-%{+YYYY.MM}"
  }
}



Error:

[2019-01-29T14:30:54,907][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-01-29T14:30:56,929][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.3.2"}
[2019-01-29T14:30:59,167][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, } at line 12, column 18 (byte 281) after input {\n  file {\n\t path => [\"P:/logs/*.txt\"]\n\t\tstart_position => \"beginning\" \n\t\tdiscover_interval => 10\n\t\tstat_interval => 10\n\t\tsincedb_write_interval => 10\n\t\tclose_older => 10\n   codec => multiline { \n\t\tpattern => \"^%{TIMESTAMP_ISO8601}\\"\n\t\tnegate => true\n        what => \"", :backtrace=>["P:/elk/logstash/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "P:/elk/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "P:/elk/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "P:/elk/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:167:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "P:/elk/logstash/logstash-core/lib/logstash/agent.rb:305:in `block in converge_state'"]}
[2019-01-29T14:31:00,417][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2019-01-29T14:34:23,554][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-01-29T14:34:24,554][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.3.2"}
[2019-01-29T14:34:27,486][ERROR][logstash.codecs.multiline] Missing a required setting for the multiline codec plugin:

  codec {
    multiline {
      pattern => # SETTING MISSING
      ...
    }
  }

[2019-01-29T14:34:27,502][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Something is wrong with your configuration.", :backtrace=>["P:/elk/logstash/logstash-core/lib/logstash/config/mixin.rb:89:in `config_init'", "P:/elk/logstash/logstash-core/lib/logstash/codecs/base.rb:19:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/plugins/plugin_factory.rb:97:in `plugin'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:110:in `plugin'", "(eval):8:in `<eval>'", "org/jruby/RubyKernel.java:994:in `eval'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:82:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline.rb:167:in `initialize'", "P:/elk/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "P:/elk/logstash/logstash-core/lib/logstash/agent.rb:305:in `block in converge_state'"]}
[2019-01-29T14:34:27,971][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
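Reading the log: the first failure is a plain config parse error, caused by the stray escaped quote at the end of the pattern => "^%{TIMESTAMP_ISO8601}\" line in an earlier version of the config; the second run then fails because pattern was removed entirely, and the multiline codec requires it. If multiline grouping is actually needed, a corrected codec would look like this (a sketch, assuming every new event begins with an ISO8601 timestamp):

codec => multiline {
  pattern => "^%{TIMESTAMP_ISO8601}"   # a new event starts with a timestamp
  negate => true                       # lines NOT matching the pattern ...
  what => "previous"                   # ... are appended to the previous event
}

For files with one JSON object per line, however, the codec can simply be dropped.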

Tags: elasticsearch, logstash, elastic-stack, logstash-grok

Solution


You can try using the json filter plugin for logstash.

This way, the filter plugin in logstash will parse the json:

filter {
  json {
    source => "message"
  }
}
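With source => "message" and no target, the parsed keys are merged into the root of the event. For the sample line above, the rubydebug output would look roughly like this (a sketch, not exact output; note that the "message" key inside the json overwrites the raw line, and @timestamp stays the ingest time unless a date filter maps TimeStamp onto it):

{
    "MachineName" => "test1",
       "longdate" => "2019-01-29 13:19:32",
          "level" => "ERROR",
         "mysite" => "test1",
        "message" => "test2",
      "exception" => "test3",
      "TimeStamp" => "2019-01-29T13:19:32.257Z",
       "@version" => "1",
     "@timestamp" => 2019-01-29T14:30:54.907Z
}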

Another nice thing is tag_on_failure. With it, if the json is invalid or misinterpreted, you will still see the message in elasticsearch/kibana, but tagged with _jsonparsefailure.

filter {
  json {
    source => "message"
    tag_on_failure => [ "_jsonparsefailure" ]
  }
}
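Putting the pieces together, here is a minimal end-to-end sketch of the whole pipeline (assumptions: every event is a single JSON line, so the multiline codec is dropped; field names, paths, and the index name are taken from the question; the date filter has to run after the json filter, because TimeStamp only exists once the json has been parsed):

input {
  file {
    path => ["P:/logs/*.txt"]
    start_position => "beginning"
  }
}

filter {
  json {
    source => "message"                       # the file input puts each raw line in "message"
    tag_on_failure => [ "_jsonparsefailure" ]
  }
  date {
    match => ["TimeStamp", "ISO8601"]         # map the event's own timestamp onto @timestamp
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => [ "http://localhost:9200" ]
    index => "log-%{+YYYY.MM}"
  }
}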
