How to create separate indices for separate input types

Problem Description

I have the logstash-syslog.conf file below, which has two different input types: one with type => "syslog" and the other with type => "APIC". So I need two separate output indices to be created, syslog-2018.08.25 and APIC-2018.08.05.

I want these indices created dynamically. I tried index => "%{[type]}-%{+YYYY.MM.dd}", but it did not work and killed Logstash.

Could you suggest what I am doing wrong in the configuration below and what needs to be fixed, both for the config and for the index types?

Below is the Logstash configuration file:

Logstash version: 6.2

$ vi logstash-syslog.conf
input {
  file {
    path => [ "/scratch/rsyslog/*/messages.log" ]
    type => "syslog"
  }
  file {
    path => [ "/scratch/rsyslog/Aug/messages.log" ]
    type => "APIC"
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
  if [type] == "APIC" {
    grok {
      match => { "message" => "%{CISCOTIMESTAMP:syslog_timestamp} %{CISCOTIMESTAMP} %{SYSLOGHOST:syslog_hostname} %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
   }
 }
}
output {
  elasticsearch {
    hosts => "noida-elk:9200"
    index => "syslog-%{+YYYY.MM.dd}"
    #index => "%{[type]}-%{+YYYY.MM.dd}"
    document_type => "messages"
  }
}

Tags: logstash-grok, logstash-configuration, logstash-file

Solution


I fixed it myself; the following configuration works for me.

    $ cat logstash-syslog.conf
    input {
      file {
        path => [ "/scratch/rsyslog/*/messages.log" ]
        type => "syslog"
      }
      file {
        path => [ "/scratch/rsyslog/Aug/messages.log" ]
        type => "apic_logs"
      }
    }

    filter {
      if [type] == "syslog" {
        grok {
          match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
          add_field => [ "received_at", "%{@timestamp}" ]
          remove_field => ["@version", "host", "message", "_type", "_index", "_score", "path"]
        }
        syslog_pri { }
        date {
          match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
        }
      }
      if [type] == "apic_logs" {
        grok {
          match => { "message" => "%{CISCOTIMESTAMP:syslog_timestamp} %{CISCOTIMESTAMP} %{SYSLOGHOST:syslog_hostname} (?<prog>[\w._/%-]+) %{SYSLOG5424SD:f1}%{SYSLOG5424SD:f2}%{SYSLOG5424SD:f3}%{SYSLOG5424SD:f4}%{SYSLOG5424SD:f5} %{GREEDYDATA:syslog_message}" }
          add_field => [ "received_at", "%{@timestamp}" ]
          remove_field => ["@version", "host", "message", "_type", "_index", "_score", "path"]
        }
      }
    }

    output {
      if [type] == "syslog" {
        elasticsearch {
          hosts => "noida-elk:9200"
          manage_template => false
          index => "syslog-%{+YYYY.MM.dd}"
          document_type => "messages"
        }
      }
    }

    output {
      if [type] == "apic_logs" {
        elasticsearch {
          hosts => "noida-elk:9200"
          manage_template => false
          index => "apic_logs-%{+YYYY.MM.dd}"
          document_type => "messages"
        }
      }
    }
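A likely reason the original sprintf attempt failed is that Elasticsearch index names must be lowercase, so a type value of "APIC" would have produced an invalid index name ("APIC-2018.08.05"). Once the type values are all lowercase (as in the renamed "apic_logs" above), the conditional output blocks can optionally be collapsed into a single output using a sprintf reference on the type field. This is a minimal sketch, not the accepted answer's approach, and it assumes every event's type is a valid lowercase index-name prefix:

    output {
      elasticsearch {
        hosts => "noida-elk:9200"
        manage_template => false
        # sprintf reference: "syslog" events go to syslog-YYYY.MM.dd,
        # "apic_logs" events go to apic_logs-YYYY.MM.dd
        index => "%{type}-%{+YYYY.MM.dd}"
        document_type => "messages"
      }
    }

The resulting indices can then be checked with the cat indices API, e.g. `curl 'noida-elk:9200/_cat/indices/syslog-*,apic_logs-*?v'`.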
