Logstash goes wrong after adding a filter

Hello everyone,

I installed Logstash on Ubuntu 18 with two files in conf.d, an input and an output.

Everything ran normally, but when I added 10-syslog-filter.conf something went wrong.

02-beats-input.conf

input {
  beats {
    host => "0.0.0.0"
    port => 5044
    ssl => false
  }
}

30-elasticsearch-output.conf

output {
  elasticsearch {
    hosts => ["192.168.186.157:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}

10-syslog-filter.conf

filter {
  
    if [fileset][name] == "auth" {
      grok {
        match => { "message" => ["%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:\[%{POSINT:[system][auth][pid]}\])?: %{DATA:[system][auth][ssh][event]} %{DATA:[system][auth][ssh][method]} for (invalid user )?%{DATA:[system][auth][user]} from %{IPORHOST:[system][auth][ssh][ip]} port %{NUMBER:[system][auth][ssh][port]} ssh2(: %{GREEDYDATA:[system][auth][ssh][signature]})?",
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:\[%{POSINT:[system][auth][pid]}\])?: %{DATA:[system][auth][ssh][event]} user %{DATA:[system][auth][user]} from %{IPORHOST:[system][auth][ssh][ip]}",
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:\[%{POSINT:[system][auth][pid]}\])?: Did not receive identification string from %{IPORHOST:[system][auth][ssh][dropped_ip]}",
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sudo(?:\[%{POSINT:[system][auth][pid]}\])?: \s*%{DATA:[system][auth][user]} :( %{DATA:[system][auth][sudo][error]} ;)? TTY=%{DATA:[system][auth][sudo][tty]} ; PWD=%{DATA:[system][auth][sudo][pwd]} ; USER=%{DATA:[system][auth][sudo][user]} ; COMMAND=%{GREEDYDATA:[system][auth][sudo][command]}",
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} groupadd(?:\[%{POSINT:[system][auth][pid]}\])?: new group: name=%{DATA:system.auth.groupadd.name}, GID=%{NUMBER:system.auth.groupadd.gid}",
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} useradd(?:\[%{POSINT:[system][auth][pid]}\])?: new user: name=%{DATA:[system][auth][user][add][name]}, UID=%{NUMBER:[system][auth][user][add][uid]}, GID=%{NUMBER:[system][auth][user][add][gid]}, home=%{DATA:[system][auth][user][add][home]}, shell=%{DATA:[system][auth][user][add][shell]}$",
                  "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} %{DATA:[system][auth][program]}(?:\[%{POSINT:[system][auth][pid]}\])?: %{GREEDYMULTILINE:[system][auth][message]}"] }
        add_field => [ "activity", "SSH Login" ]
	add_tag => ["linux_auth"]
        remove_field => "message"
      }
      date {
        match => [ "[system][auth][timestamp]", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
      }
      geoip {
        source => "[system][auth][ssh][ip]"
        target => "[system][auth][ssh][geoip]"
      }
    }
    else if [fileset][name] == "syslog" {
      grok {
        match => { "message" => ["%{SYSLOGTIMESTAMP:[system][syslog][timestamp]} %{SYSLOGHOST:[system][syslog][hostname]} %{DATA:[system][syslog][program]}(?:\[%{POSINT:[system][syslog][pid]}\])?: %{GREEDYMULTILINE:[system][syslog][message]}"] }

        remove_field => "message"
      }
      date {
        match => [ "[system][syslog][timestamp]", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
      }
    }
  
}

After adding the filter I ran /usr/share/logstash/bin/logstash --path.settings /etc/logstash -t

Configuration OK
[2021-07-15T22:04:14,707][INFO ][logstash.runner          ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash

But /var/log/logstash/logstash-plain.log had some errors:

[2021-07-15T22:03:53,920][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2021-07-15T22:03:54,129][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2021-07-15T22:03:56,481][INFO ][logstash.runner          ] Logstash shut down.
[2021-07-15T22:03:56,584][FATAL][org.logstash.Logstash    ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
	at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747) ~[jruby-complete-9.2.16.0.jar:?]
	at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710) ~[jruby-complete-9.2.16.0.jar:?]
	at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:89) ~[?:?]
[2021-07-15T22:04:01,470][INFO ][org.reflections.Reflections] Reflections took 871 ms to scan 1 urls, producing 24 keys and 48 values 
[2021-07-15T22:04:14,707][INFO ][logstash.runner          ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash

And Kibana didn't receive any new logs.

What's wrong with the conf files?

Have you tried increasing the logging level of logstash to see where it may be crashing?
I don’t see any obvious problems with your config.
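
One way to do that without restarting is Logstash's logging API (assuming the default API port 9600); for example, to turn on debug logging for the elasticsearch output:

curl -XPUT 'localhost:9600/_node/logging' -H 'Content-Type: application/json' -d'
{
    "logger.logstash.outputs.elasticsearch" : "DEBUG"
}
'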

I would expect a different error message for it, but GREEDYMULTILINE is not a valid grok pattern by default.
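
If you want to keep it, you can define it yourself inline; a minimal sketch using pattern_definitions (the (.|\n)* body is the definition the Filebeat system-module examples use):

grok {
  pattern_definitions => { "GREEDYMULTILINE" => "(.|\n)*" }
  match => { "message" => "%{SYSLOGTIMESTAMP:[system][syslog][timestamp]} %{SYSLOGHOST:[system][syslog][hostname]} %{DATA:[system][syslog][program]}(?:\[%{POSINT:[system][syslog][pid]}\])?: %{GREEDYMULTILINE:[system][syslog][message]}" }
}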

Your grok filters are going to be very expensive and may even time out. Read this blog to understand how to fix that by anchoring them.
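
For example, anchoring a pattern with a leading ^ lets the regex engine reject a non-matching line at the first character instead of retrying the match at every offset. A sketch (only the ^ is new; the capture names are illustrative):

grok {
  match => { "message" => "^%{SYSLOGTIMESTAMP:[system][syslog][timestamp]} %{SYSLOGHOST:[system][syslog][hostname]} %{GREEDYDATA:[system][syslog][rest]}" }
}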

After adding

logger.elasticsearchoutput.name = logstash.outputs.elasticsearch
logger.elasticsearchoutput.level = debug

to /etc/logstash/log4j2.properties and restarting Logstash, I ran sudo -u logstash /usr/share/logstash/bin/logstash --path.settings /etc/logstash -t and it printed:

Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2021-07-16T08:35:56,266][INFO ][logstash.runner          ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2021-07-16T08:35:56,285][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.13.1", "jruby.version"=>"jruby 9.2.16.0 (2.5.7) 2021-03-03 f82228dc32 OpenJDK 64-Bit Server VM 11.0.11+9 on 11.0.11+9 +indy +jit [linux-x86_64]"}
[2021-07-16T08:35:59,306][INFO ][org.reflections.Reflections] Reflections took 117 ms to scan 1 urls, producing 24 keys and 48 values 
[2021-07-16T08:36:00,769][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
[2021-07-16T08:36:00,778][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = false
[2021-07-16T08:36:00,778][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "120ab5d39bef109a010a87c931c2c2c59e1bebf1bd4378d2a39c62f1ae361d87"
[2021-07-16T08:36:00,793][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [//192.168.186.157:9200]
[2021-07-16T08:36:00,793][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2021-07-16T08:36:00,793][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_cb1523cc-6013-45e6-8450-a61af6b2db7b", enable_metric=>true, charset=>"UTF-8">
[2021-07-16T08:36:00,802][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2021-07-16T08:36:00,803][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2021-07-16T08:36:00,804][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2021-07-16T08:36:00,804][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2021-07-16T08:36:00,804][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2021-07-16T08:36:00,808][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2021-07-16T08:36:00,809][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2021-07-16T08:36:00,812][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2021-07-16T08:36:00,812][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2021-07-16T08:36:00,814][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2021-07-16T08:36:00,818][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2021-07-16T08:36:00,818][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@custom_headers = {}
[2021-07-16T08:36:00,819][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2021-07-16T08:36:00,819][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2021-07-16T08:36:00,819][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_type = "logs"
[2021-07-16T08:36:00,820][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_dataset = "generic"
[2021-07-16T08:36:00,820][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_namespace = "default"
[2021-07-16T08:36:00,820][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_sync_fields = true
[2021-07-16T08:36:00,821][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_auto_routing = true
[2021-07-16T08:36:00,821][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2021-07-16T08:36:00,822][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2021-07-16T08:36:00,827][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[2021-07-16T08:36:00,832][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2021-07-16T08:36:00,832][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2021-07-16T08:36:00,838][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2021-07-16T08:36:00,838][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2021-07-16T08:36:00,838][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2021-07-16T08:36:00,839][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2021-07-16T08:36:00,839][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2021-07-16T08:36:00,839][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2021-07-16T08:36:00,839][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2021-07-16T08:36:00,842][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_enabled = "auto"
[2021-07-16T08:36:00,843][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_pattern = "{now/d}-000001"
[2021-07-16T08:36:00,843][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_policy = "logstash-policy"
Configuration OK
[2021-07-16T08:36:01,210][INFO ][logstash.runner          ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash

And /var/log/logstash/logstash-plain.log contained the same output, ending with:

[2021-07-16T08:36:10,440][DEBUG][logstash.outputs.elasticsearch][main][e062dfdfdf8c6e82835f771c0907bc5b4849e0fbbf0467055b4700621cfe4f5d] Sending final bulk request for batch. {:action_count=>1, :payload_size=>2346, :content_length=>2346, :batch_offset=>0}
[2021-07-16T08:36:45,472][DEBUG][logstash.outputs.elasticsearch][main][e062dfdfdf8c6e82835f771c0907bc5b4849e0fbbf0467055b4700621cfe4f5d] Sending final bulk request for batch. {:action_count=>1, :payload_size=>2291, :content_length=>2291, :batch_offset=>0}

When I anchored them, the log went:

32 OpenJDK 64-Bit Server VM 11.0.11+9 on 11.0.11+9 +indy +jit [linux-x86_64]"}
[2021-07-16T08:58:28,911][INFO ][org.reflections.Reflections] Reflections took 158 ms to scan 1 urls, producing 24 keys and 48 values 
[2021-07-16T08:58:29,879][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2021-07-16T08:58:31,911][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
[2021-07-16T08:58:31,915][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = false
[2021-07-16T08:58:31,919][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "52a930d67c6e8ae4917d8f70cc8be4a786b23c579dd073b9e52c1123807dde6d"
[2021-07-16T08:58:31,952][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [//192.168.186.157:9200]
[2021-07-16T08:58:31,966][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2021-07-16T08:58:31,967][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_5d02d598-fafa-44b9-9e4d-849f89733445", enable_metric=>true, charset=>"UTF-8">
[2021-07-16T08:58:31,982][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2021-07-16T08:58:31,999][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2021-07-16T08:58:32,002][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2021-07-16T08:58:32,010][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2021-07-16T08:58:32,010][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2021-07-16T08:58:32,018][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2021-07-16T08:58:32,024][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2021-07-16T08:58:32,024][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2021-07-16T08:58:32,041][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2021-07-16T08:58:32,045][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2021-07-16T08:58:32,045][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2021-07-16T08:58:32,048][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@custom_headers = {}
[2021-07-16T08:58:32,050][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2021-07-16T08:58:32,050][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2021-07-16T08:58:32,051][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_type = "logs"
[2021-07-16T08:58:32,053][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_dataset = "generic"
[2021-07-16T08:58:32,055][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_namespace = "default"
[2021-07-16T08:58:32,056][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_sync_fields = true
[2021-07-16T08:58:32,058][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_auto_routing = true
[2021-07-16T08:58:32,059][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2021-07-16T08:58:32,090][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2021-07-16T08:58:32,090][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[2021-07-16T08:58:32,091][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2021-07-16T08:58:32,091][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2021-07-16T08:58:32,092][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2021-07-16T08:58:32,092][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2021-07-16T08:58:32,093][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2021-07-16T08:58:32,094][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2021-07-16T08:58:32,098][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2021-07-16T08:58:32,098][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2021-07-16T08:58:32,098][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2021-07-16T08:58:32,099][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_enabled = "auto"
[2021-07-16T08:58:32,100][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_pattern = "{now/d}-000001"
[2021-07-16T08:58:32,102][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_policy = "logstash-policy"
[2021-07-16T08:58:32,693][INFO ][logstash.runner          ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
[2021-07-16T08:58:33,859][INFO ][org.reflections.Reflections] Reflections took 119 ms to scan 1 urls, producing 24 keys and 48 values 
[2021-07-16T08:58:35,151][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
[2021-07-16T08:58:35,157][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = false
[2021-07-16T08:58:35,161][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "52a930d67c6e8ae4917d8f70cc8be4a786b23c579dd073b9e52c1123807dde6d"
[2021-07-16T08:58:35,165][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [//192.168.186.157:9200]
[2021-07-16T08:58:35,169][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2021-07-16T08:58:35,172][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_85b7a1bb-8a41-4e85-97e8-0f8365c7e9b3", enable_metric=>true, charset=>"UTF-8">
[2021-07-16T08:58:35,172][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2021-07-16T08:58:35,173][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2021-07-16T08:58:35,173][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2021-07-16T08:58:35,173][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2021-07-16T08:58:35,174][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2021-07-16T08:58:35,174][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2021-07-16T08:58:35,174][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2021-07-16T08:58:35,174][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2021-07-16T08:58:35,175][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2021-07-16T08:58:35,175][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2021-07-16T08:58:35,175][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2021-07-16T08:58:35,181][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@custom_headers = {}
[2021-07-16T08:58:35,182][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2021-07-16T08:58:35,182][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2021-07-16T08:58:35,182][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_type = "logs"
[2021-07-16T08:58:35,182][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_dataset = "generic"
[2021-07-16T08:58:35,183][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_namespace = "default"
[2021-07-16T08:58:35,183][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_sync_fields = true
[2021-07-16T08:58:35,183][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_auto_routing = true
[2021-07-16T08:58:35,183][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2021-07-16T08:58:35,184][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2021-07-16T08:58:35,184][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[2021-07-16T08:58:35,184][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2021-07-16T08:58:35,185][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2021-07-16T08:58:35,185][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2021-07-16T08:58:35,185][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2021-07-16T08:58:35,185][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2021-07-16T08:58:35,186][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2021-07-16T08:58:35,186][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2021-07-16T08:58:35,187][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2021-07-16T08:58:35,187][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2021-07-16T08:58:35,188][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_enabled = "auto"
[2021-07-16T08:58:35,188][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_pattern = "{now/d}-000001"
[2021-07-16T08:58:35,188][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_policy = "logstash-policy"
[2021-07-16T08:58:35,562][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//192.168.186.157:9200"]}
[2021-07-16T08:58:35,603][DEBUG][logstash.outputs.elasticsearch][main] Normalizing http path {:path=>nil, :normalized=>nil}
[2021-07-16T08:58:36,223][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://192.168.186.157:9200/]}}
[2021-07-16T08:58:36,242][DEBUG][logstash.outputs.elasticsearch][main] Running health check to see if an ES connection is working {:url=>"http://192.168.186.157:9200/", :path=>"/"}
[2021-07-16T08:58:36,508][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://192.168.186.157:9200/"}
[2021-07-16T08:58:36,593][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (7.13.1) {:es_version=>7}
[2021-07-16T08:58:36,598][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2021-07-16T08:58:37,543][INFO ][logstash.filters.geoip   ][main] Using geoip database {:path=>"/var/lib/logstash/plugins/filters/geoip/GeoLite2-City.mmdb", :healthy_database=>true}
[2021-07-16T08:58:37,773][ERROR][logstash.javapipeline    ][main] Pipeline error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{GREEDYMULTILINE:[system][syslog][message]} not defined>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:123:in `block in compile'", "org/jruby/RubyKernel.java:1442:in `loop'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:93:in `compile'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.4.0/lib/logstash/filters/grok.rb:282:in `block in register'", "org/jruby/RubyArray.java:1809:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.4.0/lib/logstash/filters/grok.rb:276:in `block in register'", "org/jruby/RubyHash.java:1415:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.4.0/lib/logstash/filters/grok.rb:271:in `register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:75:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:228:in `block in register_plugins'", "org/jruby/RubyArray.java:1809:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:227:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:586:in `maybe_setup_out_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:240:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:185:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:137:in `block in start'"], "pipeline.sources"=>["/etc/logstash/conf.d/02-beats-input.conf", "/etc/logstash/conf.d/10-syslog-filter.conf", "/etc/logstash/conf.d/30-elasticsearch-output.conf"], :thread=>"#<Thread:0x30581195 run>"}
[2021-07-16T08:58:37,775][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2021-07-16T08:58:37,803][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2021-07-16T08:58:37,932][INFO ][logstash.runner          ] Logstash shut down.
[2021-07-16T08:58:37,947][FATAL][org.logstash.Logstash    ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
	at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747) ~[jruby-complete-9.2.16.0.jar:?]
	at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710) ~[jruby-complete-9.2.16.0.jar:?]
	at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:89) ~[?:?]

And Kibana is still empty.

There is an exception in your grok pattern at this line:

[2021-07-16T08:58:37,773][ERROR][logstash.javapipeline    ][main] Pipeline error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{GREEDYMULTILINE:[system][syslog][message]} not defined>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:123:in `block in compile'", "org/jruby/RubyKernel.java:1442:in `loop'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:93:in `compile'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.4.0/lib/logstash/filters/grok.rb:282:in `block in register'", "org/jruby/RubyArray.java:1809:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.4.0/lib/logstash/filters/grok.rb:276:in `block in register'", "org/jruby/RubyHash.java:1415:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.4.0/lib/logstash/filters/grok.rb:271:in `register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:75:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:228:in `block in register_plugins'", "org/jruby/RubyArray.java:1809:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:227:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:586:in `maybe_setup_out_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:240:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:185:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:137:in `block in start'"], "pipeline.sources"=>["/etc/logstash/conf.d/02-beats-input.conf", "/etc/logstash/conf.d/10-syslog-filter.conf", "/etc/logstash/conf.d/30-elasticsearch-output.conf"], :thread=>"#<Thread:0x30581195 run>"}

Oh, I didn't notice that. Thanks a lot!

When I added pattern_definitions => { "GREEDYMULTILINE" => "(.|\n)*" }, the log looked like this:

[2021-07-16T12:54:13,593][INFO ][logstash.runner          ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2021-07-16T12:54:13,650][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.13.3", "jruby.version"=>"jruby 9.2.16.0 (2.5.7) 2021-03-03 f82228dc32 OpenJDK 64-Bit Server VM 11.0.11+9 on 11.0.11+9 +indy +jit [linux-x86_64]"}
[2021-07-16T12:54:19,836][INFO ][org.reflections.Reflections] Reflections took 149 ms to scan 1 urls, producing 24 keys and 48 values 
[2021-07-16T12:54:23,490][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
[2021-07-16T12:54:23,503][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = false
[2021-07-16T12:54:23,503][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4"
[2021-07-16T12:54:23,550][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [//192.168.186.157:9200]
[2021-07-16T12:54:23,560][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2021-07-16T12:54:23,565][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_3d04cf63-5850-4ffd-b92f-016dfe7153a9", enable_metric=>true, charset=>"UTF-8">
[2021-07-16T12:54:23,575][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2021-07-16T12:54:23,576][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2021-07-16T12:54:23,577][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2021-07-16T12:54:23,578][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2021-07-16T12:54:23,579][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2021-07-16T12:54:23,584][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2021-07-16T12:54:23,599][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2021-07-16T12:54:23,599][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2021-07-16T12:54:23,599][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2021-07-16T12:54:23,600][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2021-07-16T12:54:23,600][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2021-07-16T12:54:23,600][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@custom_headers = {}
[2021-07-16T12:54:23,600][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2021-07-16T12:54:23,604][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2021-07-16T12:54:23,606][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_type = "logs"
[2021-07-16T12:54:23,609][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_dataset = "generic"
[2021-07-16T12:54:23,610][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_namespace = "default"
[2021-07-16T12:54:23,611][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_sync_fields = true
[2021-07-16T12:54:23,616][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@data_stream_auto_routing = true
[2021-07-16T12:54:23,625][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2021-07-16T12:54:23,628][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2021-07-16T12:54:23,643][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[2021-07-16T12:54:23,645][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2021-07-16T12:54:23,645][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2021-07-16T12:54:23,646][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2021-07-16T12:54:23,646][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2021-07-16T12:54:23,659][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2021-07-16T12:54:23,659][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2021-07-16T12:54:23,661][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2021-07-16T12:54:23,663][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2021-07-16T12:54:23,665][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2021-07-16T12:54:23,665][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_enabled = "auto"
[2021-07-16T12:54:23,666][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_pattern = "{now/d}-000001"
[2021-07-16T12:54:23,666][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_policy = "logstash-policy"
[2021-07-16T12:54:24,237][INFO ][logstash.runner          ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
[2021-07-16T12:54:27,748][DEBUG][logstash.outputs.elasticsearch][main][2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4] Sending final bulk request for batch. {:action_count=>1, :payload_size=>2341, :content_length=>2341, :batch_offset=>0}
[2021-07-16T12:54:27,990][WARN ][logstash.runner          ] SIGTERM received. Shutting down.
[2021-07-16T12:54:34,725][DEBUG][logstash.outputs.elasticsearch][main] Closing {:plugin=>"LogStash::Outputs::ElasticSearch"}
[2021-07-16T12:54:34,756][DEBUG][logstash.outputs.elasticsearch][main] Stopping sniffer
[2021-07-16T12:54:34,761][DEBUG][logstash.outputs.elasticsearch][main] Stopping resurrectionist
[2021-07-16T12:54:35,019][DEBUG][logstash.outputs.elasticsearch][main] Waiting for in use manticore connections
[2021-07-16T12:54:35,039][DEBUG][logstash.outputs.elasticsearch][main] Closing adapter #<LogStash::Outputs::ElasticSearch::HttpClient::ManticoreAdapter:0x72f27018>
[2021-07-16T12:54:35,061][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2021-07-16T12:54:35,705][INFO ][logstash.runner          ] Logstash shut down.
[2021-07-16T12:55:59,938][INFO ][logstash.runner          ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2021-07-16T12:55:59,981][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.13.3", "jruby.version"=>"jruby 9.2.16.0 (2.5.7) 2021-03-03 f82228dc32 OpenJDK 64-Bit Server VM 11.0.11+9 on 11.0.11+9 +indy +jit [linux-x86_64]"}
[2021-07-16T12:56:07,313][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}


Maybe it's working fine, but my Kibana is still empty.


My 10-syslog-filter.conf is now:

filter {
  
    if [fileset][name] == "auth" {
      grok {
        match => { "message" => ["%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:\[%{POSINT:[system][auth][pid]}\])?: %{DATA:[system][auth][ssh][event]} %{DATA:[system][auth][ssh][method]} for (invalid user )?%{DATA:[system][auth][user]} from %{IPORHOST:[system][auth][ssh][ip]} port %{NUMBER:[system][auth][ssh][port]} ssh2(: %{GREEDYDATA:[system][auth][ssh][signature]})?",
               "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:\[%{POSINT:[system][auth][pid]}\])?: %{DATA:[system][auth][ssh][event]} user %{DATA:[system][auth][user]} from %{IPORHOST:[system][auth][ssh][ip]}",
               "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sshd(?:\[%{POSINT:[system][auth][pid]}\])?: Did not receive identification string from %{IPORHOST:[system][auth][ssh][dropped_ip]}",
               "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} sudo(?:\[%{POSINT:[system][auth][pid]}\])?: \s*%{DATA:[system][auth][user]} :( %{DATA:[system][auth][sudo][error]} ;)? TTY=%{DATA:[system][auth][sudo][tty]} ; PWD=%{DATA:[system][auth][sudo][pwd]} ; USER=%{DATA:[system][auth][sudo][user]} ; COMMAND=%{GREEDYDATA:[system][auth][sudo][command]}",
               "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} groupadd(?:\[%{POSINT:[system][auth][pid]}\])?: new group: name=%{DATA:system.auth.groupadd.name}, GID=%{NUMBER:system.auth.groupadd.gid}",
               "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} useradd(?:\[%{POSINT:[system][auth][pid]}\])?: new user: name=%{DATA:[system][auth][user][add][name]}, UID=%{NUMBER:[system][auth][user][add][uid]}, GID=%{NUMBER:[system][auth][user][add][gid]}, home=%{DATA:[system][auth][user][add][home]}, shell=%{DATA:[system][auth][user][add][shell]}$",
               "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} %{DATA:[system][auth][program]}(?:\[%{POSINT:[system][auth][pid]}\])?: %{GREEDYMULTILINE:[system][auth][message]}"] }

        pattern_definitions => {
          "GREEDYMULTILINE" => "(.|\n)*"
        }

        add_field => [ "activity", "SSH Login" ]
        add_tag => ["linux_auth"]
        remove_field => "message"
      }
      date {
        match => [ "[system][auth][timestamp]", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
      }
      geoip {
        source => "[system][auth][ssh][ip]"
        target => "[system][auth][ssh][geoip]"
      }
    }
    else if [fileset][name] == "syslog" {
      grok {
        match => { "message" => ["%{SYSLOGTIMESTAMP:[system][syslog][timestamp]} %{SYSLOGHOST:[system][syslog][hostname]} %{DATA:[system][syslog][program]}(?:\[%{POSINT:[system][syslog][pid]}\])?: %{GREEDYMULTILINE:[system][syslog][message]}"] }

        pattern_definitions => {
          "GREEDYMULTILINE" => "(.|\n)*"
        }

        remove_field => "message"
      }
      date {
        match => [ "[system][syslog][timestamp]", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
      }
    }
  
}

And /var/log/logstash/logstash-plain.log looks like this:

[2021-07-16T12:56:18,490][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//192.168.186.157:9200"]}
[2021-07-16T12:56:18,572][DEBUG][logstash.outputs.elasticsearch][main] Normalizing http path {:path=>nil, :normalized=>nil}
[2021-07-16T12:56:19,712][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://192.168.186.157:9200/]}}
[2021-07-16T12:56:19,745][DEBUG][logstash.outputs.elasticsearch][main] Running health check to see if an ES connection is working {:url=>"http://192.168.186.157:9200/", :path=>"/"}
[2021-07-16T12:56:20,299][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://192.168.186.157:9200/"}
[2021-07-16T12:56:20,565][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (7.13.3) {:es_version=>7}
[2021-07-16T12:56:20,578][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2021-07-16T12:56:22,984][INFO ][logstash.filters.geoip   ][main] Using geoip database {:path=>"/var/lib/logstash/plugins/filters/geoip/GeoLite2-City.mmdb", :healthy_database=>true}
[2021-07-16T12:56:24,740][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>125, "pipeline.sources"=>["/etc/logstash/conf.d/02-beats-input.conf", "/etc/logstash/conf.d/10-syslog-filter.conf", "/etc/logstash/conf.d/30-elasticsearch-output.conf"], :thread=>"#<Thread:0x6ab706b1 run>"}
[2021-07-16T12:56:28,823][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>4.05}
[2021-07-16T12:56:28,974][INFO ][logstash.inputs.beats    ][main] Starting input listener {:address=>"0.0.0.0:5044"}
[2021-07-16T12:56:29,060][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2021-07-16T12:56:29,553][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2021-07-16T12:56:29,809][INFO ][org.logstash.beats.Server][main][00145a0cb8463ca43ef2223569f855e834ec37f6ef6bfc2625aa94a70ea1a123] Starting server on port: 5044
[2021-07-16T12:57:01,912][DEBUG][logstash.outputs.elasticsearch][main][2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4] Sending final bulk request for batch. {:action_count=>3, :payload_size=>4758, :content_length=>4758, :batch_offset=>0}
[2021-07-16T12:57:06,992][DEBUG][logstash.outputs.elasticsearch][main][2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4] Sending final bulk request for batch. {:action_count=>125, :payload_size=>151926, :content_length=>151926, :batch_offset=>0}
[2021-07-16T12:57:20,235][DEBUG][logstash.outputs.elasticsearch][main][2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4] Sending final bulk request for batch. {:action_count=>96, :payload_size=>119014, :content_length=>119014, :batch_offset=>0}
[2021-07-16T12:57:23,811][DEBUG][logstash.outputs.elasticsearch][main][2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4] Sending final bulk request for batch. {:action_count=>1, :payload_size=>2402, :content_length=>2402, :batch_offset=>0}
[2021-07-16T12:57:40,917][DEBUG][logstash.outputs.elasticsearch][main][2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4] Sending final bulk request for batch. {:action_count=>51, :payload_size=>62254, :content_length=>62254, :batch_offset=>0}
[2021-07-16T12:57:42,369][DEBUG][logstash.outputs.elasticsearch][main][2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4] Sending final bulk request for batch. {:action_count=>16, :payload_size=>19360, :content_length=>19360, :batch_offset=>0}
[2021-07-16T12:57:44,637][DEBUG][logstash.outputs.elasticsearch][main][2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4] Sending final bulk request for batch. {:action_count=>19, :payload_size=>22994, :content_length=>22994, :batch_offset=>0}
[2021-07-16T12:57:45,995][DEBUG][logstash.outputs.elasticsearch][main][2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4] Sending final bulk request for batch. {:action_count=>4, :payload_size=>4865, :content_length=>4865, :batch_offset=>0}
[2021-07-16T12:57:46,676][DEBUG][logstash.outputs.elasticsearch][main][2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4] Sending final bulk request for batch. {:action_count=>11, :payload_size=>13277, :content_length=>13277, :batch_offset=>0}
[2021-07-16T12:57:47,974][DEBUG][logstash.outputs.elasticsearch][main][2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4] Sending final bulk request for batch. {:action_count=>3, :payload_size=>3630, :content_length=>3630, :batch_offset=>0}
[2021-07-16T12:57:48,762][DEBUG][logstash.outputs.elasticsearch][main][2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4] Sending final bulk request for batch. {:action_count=>42, :payload_size=>50825, :content_length=>50825, :batch_offset=>0}
[2021-07-16T12:57:53,594][DEBUG][logstash.outputs.elasticsearch][main][2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4] Sending final bulk request for batch. {:action_count=>1, :payload_size=>1177, :content_length=>1177, :batch_offset=>0}
[2021-07-16T12:57:56,659][DEBUG][logstash.outputs.elasticsearch][main][2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4] Sending final bulk request for batch. {:action_count=>1, :payload_size=>2385, :content_length=>2385, :batch_offset=>0}
[2021-07-16T12:58:21,652][DEBUG][logstash.outputs.elasticsearch][main][2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4] Sending final bulk request for batch. {:action_count=>1, :payload_size=>2386, :content_length=>2386, :batch_offset=>0}
[2021-07-16T12:58:56,647][DEBUG][logstash.outputs.elasticsearch][main][2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4] Sending final bulk request for batch. {:action_count=>1, :payload_size=>2360, :content_length=>2360, :batch_offset=>0}
[2021-07-16T12:59:21,663][DEBUG][logstash.outputs.elasticsearch][main][2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4] Sending final bulk request for batch. {:action_count=>1, :payload_size=>2290, :content_length=>2290, :batch_offset=>0}
[2021-07-16T12:59:56,673][DEBUG][logstash.outputs.elasticsearch][main][2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4] Sending final bulk request for batch. {:action_count=>1, :payload_size=>2334, :content_length=>2334, :batch_offset=>0}
[2021-07-16T13:00:24,123][DEBUG][logstash.outputs.elasticsearch][main][2cc5735bb184863babd5a9a2142f700a58d17b0fb2eb5a9b26fb8d8baea966b4] Sending final bulk request for batch. {:action_count=>1, :payload_size=>2264, :content_length=>2264, :batch_offset=>0}

Check if the index has been created using the API or in index management in Kibana. To start discovering logs, you need to create an index pattern in Kibana.
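
For example, in Kibana Dev Tools (assuming your Filebeat indices follow the filebeat-* naming that the output config above produces):

GET _cat/indices/filebeat-*?v

If the indices exist but Discover is empty, also widen the time range; the date filter rewrites @timestamp to the original syslog time.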

I created an index pattern in Kibana before this mess. When I hadn't added the filter, it received logs daily, but now it doesn't anymore.

When I delete the filter, it still receives logs, but not when I add the filter again.

It's empty.

I suggest you check all your grok patterns in Kibana's Dev Tools Grok Debugger one more time. Then go look at the logs themselves on your hosts and check whether there really are that many logs. Also test your Filebeat configuration on your hosts, to make sure you have not changed anything, using:

filebeat test config

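You can also verify from the shipping host that Filebeat can reach its configured output (the Logstash beats listener on port 5044 here):

filebeat test output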

You can test the output and check whether it is what you expect by using the elasticsearch output. Also try removing stdout from 30-elasticsearch-output.conf.
Try making the @timestamp time range smaller in Kibana to see more detail in the logs.

You can run Logstash as a service instead of the command you mentioned:

service logstash restart

By default, it reads everything in its conf.d directory and loads the files in order.
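
More precisely, on a package install the stock /etc/logstash/pipelines.yml points the main pipeline at a glob over conf.d, so all matching files are concatenated into a single pipeline in lexicographic filename order (which is why the 02-/10-/30- prefixes matter):

- pipeline.id: main
  path.config: "/etc/logstash/conf.d/*.conf"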

I tried the Grok Debugger in Dev Tools; it seems some fields work fine and some can't be filtered and give an error.

How can I find GREEDYMULTILINE? I thought it was in grok's pattern dictionary, but it's not. Maybe that's what causes the error.

About Filebeat:

[screenshot]

There are some logs from auth.log in my Filebeat.


Your grok pattern is wrong, and thus Logstash cannot parse it. Could you provide the log and the pattern that gave you the error?

I think GREEDYMULTILINE causes the error because it's not among grok's built-in patterns.
About the rest, some of it said "Provided Grok patterns do not match data in the input", but some still filter normally.


It's better if you copy them here; I cannot test it otherwise. :)

Here are some logs from auth.log:

https://newtextdocument.com/4067e15cee

This line makes an error for me:

               "%{SYSLOGTIMESTAMP:[system][auth][timestamp]} %{SYSLOGHOST:[system][auth][hostname]} %{DATA:[system][auth][program]}(?:\[%{POSINT:[system][auth][pid]}\])?: %{GREEDYMULTILINE:[system][auth][message]}"] }


Well, apart from your grok pattern (GREEDYMULTILINE), which I think is a custom pattern defined somewhere else, there's this exception in your log that says the path you have provided is a directory, not a file:

[2019-10-15T05:19:35,351][FATAL][logstash.runner          ] An unexpected error occurred! {:error=>java.lang.IllegalStateException: org.jruby.exceptions.SystemCallError: (EISDIR) Is a directory - /usr/share/logstash/data/filter-hashtree, :backtrace=>["org.logstash.execution.WorkerLoop.run(org/logstash/execution/WorkerLoop.java:85)", "jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)", "jdk.internal.reflect.NativeMethodAccessorImpl.invoke(jdk/internal/reflect/NativeMethodAccessorImpl.java:62)", "jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(jdk/internal/reflect/DelegatingMethodAccessorImpl.java:43)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:566)", "org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:440)", "org.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:304)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:239)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:295)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:274)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:270)", "java.lang.Thread.run(java/lang/Thread.java:834)"]}

it says:

org.jruby.exceptions.SystemCallError: (EISDIR) Is a directory - /usr/share/logstash/data/filter-hashtree

About the custom pattern: if it is one, its path should be provided in your filters like:

filter {
  grok {
    patterns_dir => ["PATH_TO_PATTERNS_DIRECTORY"]
    match => ...
  }
}
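
A file in that patterns directory defines one pattern per line, the name followed by its regex; for the pattern in question it would contain something like:

GREEDYMULTILINE (.|\n)*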

I found GREEDYMULTILINE in Configuration Examples | Logstash Reference [5.4] | Elastic, so I thought it wasn't a custom pattern, but now I know it's not built in and I still can't find where it's defined. Maybe I'll delete that line.

About the exception you mentioned, I didn't see it in my log (I searched for that). Where did you find it?

The only thing I can find with FATAL is:

[2021-07-16T08:59:25,728][FATAL][org.logstash.Logstash    ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
	at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747) ~[jruby-complete-9.2.16.0.jar:?]
	at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710) ~[jruby-complete-9.2.16.0.jar:?]
	at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:89) ~[?:?]