Logstash grok pattern fails config check

Dear all,

I am trying to create an SSH grok pattern. This is my configuration:

input {
  beats {
    port => 5044
	host => "192.168.11.13"
  }
}
# Capture_all_MSG
filter {
	if [fileset][name] == "auth" {
	grok {
	match => { "message" => [ "%{SYSLOGTIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:system.auth.hostname} sshd(?:\[%{NUMBER:system.auth.pid}\])?: %{GREEDYDATA:msgsplit}",
				"%{SYSLOGTIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:system.auth.hostname} %{WORD:system.auth.su}: %{GREEDYDATA:msgsplit}"] }
		}
# SSH Listening Service
	if "grokked" not in [tags] and "sshd" == [system.auth.pid] {
	grok {
	match => { "msgsplit", [ "%{WORD:service.ssh.hostname} %{WORD:service.ssh.event} on %{IPV4:service.listen.ip} port %{IPORHOST:service.listen.port}." ] }
	add_tag => [ "ssh_listening_service", "grokked" ]
	tag_on_failure => [ ]
	}
}
# SSH Terminating Service
	if "grokked" not in [tags] and "sshd" == [system.auth.pid] {
	grok {
	match => { "msgsplit", "Received signal %{NUMBER:service.ssh.signal}; %{WORD:service.ssh.event}." ] }
	add_tag => [ "ssh_terminating_service", "grokked" ]
	tag_on_failure => [ ]
	}
}
# SSH Accepted
	if "grokked" not in [tags] and "sshd" == [system.auth.pid] {
	grok {
	match => { "msgsplit", "%{WORD:system.auth.event} %{WORD:system.auth.method} for %{USER:system.auth.user} from %{IPV4:system.auth.srcip} port %{BASE10NUM:system.auth.port} %{PROG:system.auth.program}",
			   "pam_unix\(sshd:session\): %{WORD:system.auth.method} %{WORD:system.auth.event} for user %{USERNAME:system.auth.user} by \(uid=%{NUMBER:system.auth.uid}\)" ] }
	add_tag => [ "ssh_accepted", "grokked" ]
	tag_on_failure => [ ]
	}
}
# SSH Failed
	if "grokked" not in [tags] and "sshd" == [system.auth.pid] {
	grok {
	match => { "msgsplit", [ "%{WORD:system.auth.event} %{WORD:system.auth.method} for %{USER:system.auth.user} from %{IPV4:system.auth.srcip} port %{BASE10NUM:system.auth.port} %{PROG:system.auth.program}",
				"pam_unix\(sshd:auth\): %{WORD:system.auth.method} %{WORD:system.auth.event}; logname= uid=%{NUMBER:system.auth.uid} euid=%{NUMBER:system.auth.euid} tty=%{PROG:system.auth.tty} ruser= rhost=%{IPV4:system.auth.rhost}\s+user=%{USERNAME:system.auth.user}",
				"error: maximum %{WORD:system.auth.method} attempts %{WORD:system.auth.event} for %{USERNAME:system.auth.user} from %{IPV4:system.auth.srcip} port %{NUMBER:system.auth.port} %{PROG:system.auth.program} (?:\[%{WORD:system.auth.signature}\])?" ] }
	add_tag => [ "ssh_failed", "grokked" ]
	tag_on_failure => [ ]
	}
}
# SSH Session Open
	if "grokked" not in [tags] and "sshd" == [system.auth.pid] {
	grok {
	match => { "msgsplit", [ "pam_unix\(sshd:session\): %{DATA:system.auth.method} %{WORD:system.auth.event} for user %{USERNAME:system.auth.user} by \(uid=%{NUMBER:system.auth.uid}\)" ] }
	add_tag => [ "ssh_session_open", "grokked" ]
	tag_on_failure => [ ]
	}
}
# SSH Session Closed
	if "grokked" not in [tags] and "sshd" == [system.auth.pid] {
	grok {
	match => { "msgsplit", [ "pam_unix\(sshd:session\): %{DATA:system.auth.method} %{WORD:system.auth.event} for user %{USERNAME:system.auth.user}" ] }
	add_tag => [ "ssh_session_closed", "grokked" ]
	tag_on_failure => [ ]
	}
}
# SSH Session Open by SU
	if "grokked" not in [tags] and "su" == [system.auth.su] {
	grok {
	match => { "msgsplit", [ "pam_unix\(su-l:session\): %{WORD:system.auth.method} %{WORD:system.auth.event} for user %{USERNAME:system.auth.user} by %{USERNAME:system.auth.by}\(uid=%{NUMBER:system.auth.uid}\)" ] }
	add_tag => [ "ssh_session_open_su", "grokked" ]
	tag_on_failure => [ ]
	}
}
# SSH Session Closed by SU
	if "grokked" not in [tags] and "su" == [system.auth.su] {
	grok {
	match => { "msgsplit", [ "pam_unix\(su-l:session\): %{WORD:system.auth.method} %{WORD:system.auth.event} for user %{USERNAME:system.auth.user}" ] }
	add_tag => [ "ssh_session_closed_su", "grokked" ]
	tag_on_failure => [ ]
	}
}
}
}
output {
   elasticsearch {
     hosts => ["localhost:9200"]
     manage_template => false
     index => "ssh_auth-%{+YYYY.MM}"
}
}

After checking my configuration, I get this message:

grok - Invalid setting for grok filter plugin:

  filter {
    grok {
      # This setting must be a hash
      # This field must contain an even number of items, got 3
      match => ["msgsplit", "%{WORD:system.auth.event} %{WORD:system.auth.method} for %{USER:system.auth.user} from %{IPV4:system.auth.srcip} port %{BASE10NUM:system.auth.port} %{PROG:system.auth.program}", "pam_unix\\(sshd:session\\): %{WORD:system.auth.method} %{WORD:system.auth.event} for user %{USERNAME:system.auth.user} by \\(uid=%{NUMBER:system.auth.uid}\\)"]
      ...
    }
  }

Any help with this? Or is there a better way to create a grok pattern for these SSH messages?
I am using Logstash 7.2.

Thanks and Regards

That should be

match => { "msgsplit" => [ "%{WORD:system.auth.event} %{WORD:system.auth.method} for %{USER:system.auth.user} from %{IPV4:system.auth.srcip} port %{BASE10NUM:system.auth.port} %{PROG:system.auth.program}", "pam_unix\\(sshd:session\\): %{WORD:system.auth.method} %{WORD:system.auth.event} for user %{USERNAME:system.auth.user} by \\(uid=%{NUMBER:system.auth.uid}\\)"] }

Hi @Badger,

I already changed that line and checked everything, but I am still facing this issue:

 runner - The given configuration is invalid. Reason: Expected one of #, => at line 17, column 23 (byte 556) after filter {
        if [fileset][name] == "auth" {
        grok {
        match => { "message" => [ "%{SYSLOGTIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:system.auth.hostname} sshd(?:\[%{NUMBER:system.auth.pid}\])?: %{GREEDYDATA:msgsplit}",
                                "%{SYSLOGTIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:system.auth.hostname} %{WORD:system.auth.su}: %{GREEDYDATA:msgsplit}" ] }
                }
# SSH Listening Service
        if "grokked" not in [tags] and "sshd" == [system.auth.pid] {
        grok {
        match => { "msgsplit"

I already checked line 17. What am I missing?

This is line 17:

# SSH Listening Service
	if "grokked" not in [tags] and "sshd" == [system.auth.pid] {
	grok {
	match => { "msgsplit", [ "%{WORD:service.ssh.hostname} %{WORD:service.ssh.event} on %{IPV4:service.listen.ip} port %{IPORHOST:service.listen.port}." ] }
	add_tag => [ "ssh_listening_service", "grokked" ]
	tag_on_failure => [ ]
	}
}

I need advice. Thanks for the reply.

That should be

match => { "msgsplit" => [
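In full, applying that fix to the listening-service block would look like this (a sketch reusing the patterns from the original post):

```
# SSH Listening Service
if "grokked" not in [tags] and "sshd" == [system.auth.pid] {
  grok {
    # the hash key and the pattern array are joined with =>, not a comma
    match => { "msgsplit" => [ "%{WORD:service.ssh.hostname} %{WORD:service.ssh.event} on %{IPV4:service.listen.ip} port %{IPORHOST:service.listen.port}." ] }
    add_tag => [ "ssh_listening_service", "grokked" ]
    tag_on_failure => [ ]
  }
}
```

The same `"msgsplit" => [ ... ]` change applies to every grok block in the filter.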

@Badger

Thanks, it is working now. But why is it only getting data from 11 July? I have data from 8 July, yet I can only see data from 11 July in Kibana, even with the date range set in Kibana.

I think I made a mistake somewhere; maybe you can help.

Thanks again.

If you are saying that the logs are from July 8th but @timestamp is from today, then to fix that you would use a date filter to parse system.auth.timestamp.

@Badger

I already set a date filter with a match, but it still shows 11 July. Restarting Logstash and Filebeat did not help. Any advice?

The log format is:

Jul 8 15:31:40

and I use:

MMM dd HH:mm:ss

It is not working.

Thanks

If your log format is

Jul 8 15:31:40 

then that date pattern would match. However, if it is

Jul  8 15:31:40

then you need to use

date { match => [ "message", "MMM dd HH:mm:ss", "MMM  d HH:mm:ss" ] }

@Badger

It is still not working. I tried both "message" and "system.auth.timestamp"; neither works.

Thanks

I am still trying to figure this out; the pattern and configuration are still the same. Maybe someone can give me advice.

Thanks

Any help? The secure log has data from 8 July, but Kibana only shows data from today, 12 July. What am I missing?

I am using the default Filebeat configuration; this is the system module config:
# Module: system
# Docs: https://www.elastic.co/guide/en/beats/filebeat/7.2/filebeat-module-system.html

- module: system
  # Syslog
  syslog:
    enabled: false

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    #var.paths:

    # Convert the timestamp to UTC. Requires Elasticsearch >= 6.1.
    #var.convert_timezone: true

  # Authorization logs
  auth:
    enabled: true

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    #var.paths: ["/var/log/secure"]

    # Convert the timestamp to UTC. Requires Elasticsearch >= 6.1.
    #var.convert_timezone: true

Thanks.

Yeah, where I wrote "message" I meant "system.auth.timestamp".
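So the date filter would be something like this (a sketch; it assumes the grok filter earlier in the pipeline has already populated system.auth.timestamp):

```
date {
  # two variants: double space before a single-digit day, and a two-digit day
  match => [ "system.auth.timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
}
```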

@Badger

I tried simplifying the grok pattern:

input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => { "message" => ["%{GREEDYDATA:allmsg}"] }
  }
  date { match => [ "message", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ] }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => false
    index => "ssh_auth-%{+YYYY.MM.dd}"
  }
}

It is still the same; I don't know what is going on.

I am using the system module for this. Is there any configuration I am missing? I cannot get data from the older dates.

This is my system date:

[screenshot]

I need advice, please.

Can anyone help me fix this issue? I don't know how to fix it; I have searched everywhere and tried everything, and nothing has worked so far.

Regards.

The pattern in a date filter has to exactly match the entire field. So matching against [message] is not going to work. As I said, I intended that to be system.auth.timestamp rather than message.
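To illustrate the difference (a sketch; the field contents shown in the comments are hypothetical example values):

```
# Fails: the pattern covers only the start of the field
# [message] = "Jul  8 15:31:40 myhost sshd[1234]: Server listening on 0.0.0.0 port 22."
date { match => [ "message", "MMM  d HH:mm:ss" ] }

# Works: the field contains exactly the timestamp and nothing else
# [system.auth.timestamp] = "Jul  8 15:31:40"
date { match => [ "system.auth.timestamp", "MMM  d HH:mm:ss" ] }
```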

Hi Badger,

It is still not working. I simplified the Logstash configuration:

input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => { "message" => [ "%{SYSLOGTIMESTAMP:timestamp} %{GREEDYDATA:allmsg}" ] }
  }
  date {
    match => [ "timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => false
    index => "ssh_auth-%{+YYYY.MM}"
  }
}

Any idea why this is still not working?

Thanks

What does your message field look like?

Hi Badger,

Like this:

[screenshot]

That looks like the time got parsed correctly. What is the problem?