Logstash filter broken: Kibana stops showing new logs

Problem description

I installed Logstash on Ubuntu 18 with two files under conf.d for input and output.

Everything worked and Kibana received all the logs. But after I added 10-syslog-filter.conf, Kibana stopped receiving any new logs.

My Filebeat configuration

# ============================== Filebeat modules ==============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

# ======================= Elasticsearch template setting =======================

#setup.template.settings:
  #index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false
setup.ilm.overwrite: true

# =================================== Kibana ===================================

# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.
setup.kibana:
# ------------------------------ Logstash Output -------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["192.168.186.157:5044"]
  index: "filebeat-192.168.186.146"
  bulk_max_size: 1024

# ================================= Processors =================================
processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~
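Before restarting Filebeat it is worth validating this file: YAML is indentation-sensitive, and a single mis-indented key silently changes the configuration. A quick sanity check (assuming a standard package install where `filebeat` is on the PATH):

```shell
# Check that the YAML parses and all settings are recognized
filebeat test config

# Check that Filebeat can actually reach the Logstash endpoint on 5044
filebeat test output
```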


My system.yml

- module: system
  # Syslog
  syslog:
    enabled: true

  # Authorization logs
  auth:
    enabled: true

My input and output

input {
  beats {
    host => "0.0.0.0"
    port => 5044
    ssl => false
  }
}
output {
  elasticsearch {
    hosts => ["192.168.186.157:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}
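The pipeline files themselves can be syntax-checked before (re)loading them, which catches a malformed filter file before it takes the pipeline down. A sketch, assuming the default Debian/Ubuntu package layout under /usr/share/logstash and /etc/logstash:

```shell
# Parse all pipeline files and exit; reports syntax errors without starting Logstash
sudo /usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit
```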

10-syslog-filter.conf

filter {
    if [fileset][name] == "auth" {
      grok {
        match => { "message" => ["%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:\[%{POSINT:[system][auth][pid]}\])?: %{DATA:[system][auth][ssh][event]} %{DATA:[system][auth][ssh][method]} for (invalid user )?%{DATA:[system][auth][user]} from %{IPORHOST:[system][auth][ssh][ip]} port %{NUMBER:[system][auth][ssh][port]} ssh2(: %{GREEDYDATA:[system][auth][ssh][signature]})?",
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:\[%{POSINT:[system][auth][pid]}\])?: %{DATA:[system][auth][ssh][event]} user %{DATA:[system][auth][user]} from %{IPORHOST:[system][auth][ssh][ip]}",
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:\[%{POSINT:[system][auth][pid]}\])?: Did not receive identification string from %{IPORHOST:[system][auth][ssh][dropped_ip]}",
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sudo(?:\[%{POSINT:[system][auth][pid]}\])?: \s*%{DATA:[system][auth][user]} :( %{DATA:[system][auth][sudo][error]} ;)? TTY=%{DATA:[system][auth][sudo][tty]} ; PWD=%{DATA:[system][auth][sudo][pwd]} ; USER=%{DATA:[system][auth][sudo][user]} ; COMMAND=%{GREEDYDATA:[system][auth][sudo][command]}",
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} groupadd(?:\[%{POSINT:[system][auth][pid]}\])?: new group: name=%{DATA:[system][auth][groupadd][name]}, GID=%{NUMBER:[system][auth][groupadd][gid]}",
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} useradd(?:\[%{POSINT:[system][auth][pid]}\])?: new user: name=%{DATA:[system][auth][user][add][name]}, UID=%{NUMBER:[system][auth][user][add][uid]}, GID=%{NUMBER:[system][auth][user][add][gid]}, home=%{DATA:[system][auth][user][add][home]}, shell=%{DATA:[system][auth][user][add][shell]}$",
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} %{DATA:[system][auth][program]}(?:\[%{POSINT:[system][auth][pid]}\])?: %{GREEDYMULTILINE:[system][auth][message]}"] }
        pattern_definitions => {
          "GREEDYMULTILINE"=> "(.|\n)*"
        }
        remove_field => "message"
      }
      date {
        match => [ "[system][auth][timestamp]", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
      }
      geoip {
        source => "[system][auth][ssh][ip]"
        target => "[system][auth][ssh][geoip]"
      }
    }
    else if [fileset][name] == "syslog" {
      grok {
        match => { "message" => ["%{SYSLOGTIMESTAMP:[system][syslog][timestamp]} %{SYSLOGHOST:[system][syslog][hostname]} %{DATA:[system][syslog][program]}(?:\[%{POSINT:[system][syslog][pid]}\])?: %{GREEDYMULTILINE:[system][syslog][message]}"] }
        pattern_definitions => { "GREEDYMULTILINE" => "(.|\n)*" }
        remove_field => "message"
      }
      date {
        match => [ "[system][syslog][timestamp]", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
      }
    }
}
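Note that a grok pattern that fails to match does not drop the event: Logstash adds a `_grokparsefailure` tag and passes the event through. So if Kibana receives nothing at all after adding this file, the more likely cause is that the whole pipeline failed to load (for example, a syntax error in the filter file), which shows up in the Logstash log. A sketch, assuming the default log location of an Ubuntu package install:

```shell
# Pipeline load errors appear here shortly after a restart
sudo tail -n 100 /var/log/logstash/logstash-plain.log
```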

JSON of a log document fetched from Kibana with no filter applied

{
  "_index": "filebeat-192.168.186.148-7.13.1-2021.07.15",
  "_type": "_doc",
  "_id": "CNoVqHoBm01HzXPOZ5CW",
  "_version": 1,
  "_score": null,
  "fields": {
    "agent.version.keyword": [
      "7.13.1"
    ],
    "host.architecture.keyword": [
      "x86_64"
    ],
    "host.name.keyword": [
      "kali"
    ],
    "event.dataset.keyword": [
      "system.syslog"
    ],
    "host.hostname": [
      "kali"
    ],
    "host.mac": [
      "00:0c:29:0c:60:65"
    ],
    "agent.hostname.keyword": [
      "kali"
    ],
    "service.type": [
      "system"
    ],
    "ecs.version.keyword": [
      "1.9.0"
    ],
    "host.ip.keyword": [
      "192.168.186.148",
      "fe80::20c:29ff:fe0c:6065"
    ],
    "host.os.version": [
      "2020.3"
    ],
    "host.os.name": [
      "Kali GNU/Linux"
    ],
    "agent.name": [
      "kali"
    ],
    "host.id.keyword": [
      "0c42c6c017eb4a808d334aedb1e3f72f"
    ],
    "host.name": [
      "kali"
    ],
    "host.os.version.keyword": [
      "2020.3"
    ],
    "host.os.type": [
      "linux"
    ],
    "agent.id.keyword": [
      "f152b60c-88cf-41a2-9a23-0f0119532b35"
    ],
    "fileset.name": [
      "syslog"
    ],
    "input.type": [
      "log"
    ],
    "@version.keyword": [
      "1"
    ],
    "log.offset": [
      1276740
    ],
    "agent.hostname": [
      "kali"
    ],
    "tags": [
      "beats_input_codec_plain_applied"
    ],
    "host.architecture": [
      "x86_64"
    ],
    "fileset.name.keyword": [
      "syslog"
    ],
    "agent.id": [
      "f152b60c-88cf-41a2-9a23-0f0119532b35"
    ],
    "ecs.version": [
      "1.9.0"
    ],
    "host.containerized": [
      false
    ],
    "event.module.keyword": [
      "system"
    ],
    "host.hostname.keyword": [
      "kali"
    ],
    "agent.version": [
      "7.13.1"
    ],
    "host.os.family": [
      ""
    ],
    "event.timezone.keyword": [
      "-04:00"
    ],
    "service.type.keyword": [
      "system"
    ],
    "input.type.keyword": [
      "log"
    ],
    "tags.keyword": [
      "beats_input_codec_plain_applied"
    ],
    "host.ip": [
      "192.168.186.148",
      "fe80::20c:29ff:fe0c:6065"
    ],
    "agent.type": [
      "filebeat"
    ],
    "event.module": [
      "system"
    ],
    "host.os.kernel.keyword": [
      "5.7.0-kali1-amd64"
    ],
    "host.os.kernel": [
      "5.7.0-kali1-amd64"
    ],
    "@version": [
      "1"
    ],
    "host.os.name.keyword": [
      "Kali GNU/Linux"
    ],
    "host.id": [
      "0c42c6c017eb4a808d334aedb1e3f72f"
    ],
    "log.file.path.keyword": [
      "/var/log/syslog"
    ],
    "event.timezone": [
      "-04:00"
    ],
    "agent.type.keyword": [
      "filebeat"
    ],
    "agent.ephemeral_id.keyword": [
      "e21d75b7-c2c5-48ec-8330-b3bd8beb8309"
    ],
    "host.os.codename.keyword": [
      "kali-rolling"
    ],
    "host.mac.keyword": [
      "00:0c:29:0c:60:65"
    ],
    "agent.name.keyword": [
      "kali"
    ],
    "host.os.codename": [
      "kali-rolling"
    ],
    "message": [
      "Jul 14 22:53:23 kali filebeat[3438]: 2021-07-14T22:53:23.078-0400#011INFO#011[monitoring]#011log/log.go:144#011Non-zero metrics in the last 30s#011{\"monitoring\": {\"metrics\": {\"beat\":{\"cpu\":{\"system\":{\"ticks\":480},\"total\":{\"ticks\":2520,\"time\":{\"ms\":20},\"value\":2520},\"user\":{\"ticks\":2040,\"time\":{\"ms\":20}}},\"handles\":{\"limit\":{\"hard\":524288,\"soft\":1024},\"open\":12},\"info\":{\"ephemeral_id\":\"e21d75b7-c2c5-48ec-8330-b3bd8beb8309\",\"uptime\":{\"ms\":2760089}},\"memstats\":{\"gc_next\":18380848,\"memory_alloc\":11123392,\"memory_total\":365795960,\"rss\":138686464},\"runtime\":{\"goroutines\":37}},\"filebeat\":{\"events\":{\"added\":1,\"done\":1},\"harvester\":{\"open_files\":1,\"running\":1}},\"libbeat\":{\"config\":{\"module\":{\"running\":1}},\"output\":{\"events\":{\"acked\":1,\"active\":0,\"batches\":1,\"total\":1},\"read\":{\"bytes\":12},\"write\":{\"bytes\":1110}},\"pipeline\":{\"clients\":2,\"events\":{\"active\":0,\"published\":1,\"total\":1},\"queue\":{\"acked\":1}}},\"registrar\":{\"states\":{\"current\":32,\"update\":1},\"writes\":{\"success\":1,\"total\":1}},\"system\":{\"load\":{\"1\":0.1,\"15\":0.02,\"5\":0.06,\"norm\":{\"1\":0.1,\"15\":0.02,\"5\":0.06}}}}}}"
    ],
    "host.os.family.keyword": [
      ""
    ],
    "@timestamp": [
      "2021-07-15T02:53:26.202Z"
    ],
    "host.os.type.keyword": [
      "linux"
    ],
    "host.os.platform": [
      "kali"
    ],
    "host.os.platform.keyword": [
      "kali"
    ],
    "log.file.path": [
      "/var/log/syslog"
    ],
    "agent.ephemeral_id": [
      "e21d75b7-c2c5-48ec-8330-b3bd8beb8309"
    ],
    "event.dataset": [
      "system.syslog"
    ]
  },
  "sort": [
    1626317606202
  ]
}

When I deleted the filter file, Kibana started receiving logs again.

What is wrong here? Can anyone help, please?

Tags: logstash, elastic-stack, logstash-configuration
