I am trying to create an SSH grok pattern. This is my pipeline configuration:
input {
  beats {
    port => 5044
    host => "192.168.11.13"
  }
}

# Capture_all_MSG
filter {
  if [fileset][name] == "auth" {
    grok {
      match => { "message" => [ "%{SYSLOGTIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:system.auth.hostname} sshd(?:\[%{NUMBER:system.auth.pid}\])?: %{GREEDYDATA:msgsplit}",
                 "%{SYSLOGTIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:system.auth.hostname} %{WORD:system.auth.su}: %{GREEDYDATA:msgsplit}"] }
    }

    # SSH Listening Service
    if "grokked" not in [tags] and "sshd" == [system.auth.pid] {
      grok {
        match => { "msgsplit", [ "%{WORD:service.ssh.hostname} %{WORD:service.ssh.event} on %{IPV4:service.listen.ip} port %{IPORHOST:service.listen.port}." ] }
        add_tag => [ "ssh_listening_service", "grokked" ]
        tag_on_failure => [ ]
      }
    }

    # SSH Terminating Service
    if "grokked" not in [tags] and "sshd" == [system.auth.pid] {
      grok {
        match => { "msgsplit", "Received signal %{NUMBER:service.ssh.signal}; %{WORD:service.ssh.event}." ] }
        add_tag => [ "ssh_terminating_service", "grokked" ]
        tag_on_failure => [ ]
      }
    }

    # SSH Accepted
    if "grokked" not in [tags] and "sshd" == [system.auth.pid] {
      grok {
        match => { "msgsplit", "%{WORD:system.auth.event} %{WORD:system.auth.method} for %{USER:system.auth.user} from %{IPV4:system.auth.srcip} port %{BASE10NUM:system.auth.port} %{PROG:system.auth.program}",
                   "pam_unix\(sshd:session\): %{WORD:system.auth.method} %{WORD:system.auth.event} for user %{USERNAME:system.auth.user} by \(uid=%{NUMBER:system.auth.uid}\)" ] }
        add_tag => [ "ssh_accepted", "grokked" ]
        tag_on_failure => [ ]
      }
    }

    # SSH Failed
    if "grokked" not in [tags] and "sshd" == [system.auth.pid] {
      grok {
        match => { "msgsplit", [ "%{WORD:system.auth.event} %{WORD:system.auth.method} for %{USER:system.auth.user} from %{IPV4:system.auth.srcip} port %{BASE10NUM:system.auth.port} %{PROG:system.auth.program}",
                   "pam_unix\(sshd:auth\): %{WORD:system.auth.method} %{WORD:system.auth.event}; logname= uid=%{NUMBER:system.auth.uid} euid=%{NUMBER:system.auth.euid} tty=%{PROG:system.auth.tty} ruser= rhost=%{IPV4:system.auth.rhost}\s+user=%{USERNAME:system.auth.user}",
                   "error: maximum %{WORD:system.auth.method} attempts %{WORD:system.auth.event} for %{USERNAME:system.auth.user} from %{IPV4:system.auth.srcip} port %{NUMBER:system.auth.port} %{PROG:system.auth.program} (?:\[%{WORD:system.auth.signature}\])?" ] }
        add_tag => [ "ssh_failed", "grokked" ]
        tag_on_failure => [ ]
      }
    }

    # SSH Session Open
    if "grokked" not in [tags] and "sshd" == [system.auth.pid] {
      grok {
        match => { "msgsplit", [ "pam_unix\(sshd:session\): %{DATA:system.auth.method} %{WORD:system.auth.event} for user %{USERNAME:system.auth.user} by \(uid=%{NUMBER:system.auth.uid}\)" ] }
        add_tag => [ "ssh_session_open", "grokked" ]
        tag_on_failure => [ ]
      }
    }

    # SSH Session Closed
    if "grokked" not in [tags] and "sshd" == [system.auth.pid] {
      grok {
        match => { "msgsplit", [ "pam_unix\(sshd:session\): %{DATA:system.auth.method} %{WORD:system.auth.event} for user %{USERNAME:system.auth.user}" ] }
        add_tag => [ "ssh_session_closed", "grokked" ]
        tag_on_failure => [ ]
      }
    }

    # SSH Session Open by SU
    if "grokked" not in [tags] and "su" == [system.auth.su] {
      grok {
        match => { "msgsplit", [ "pam_unix\(su-l:session\): %{WORD:system.auth.method} %{WORD:system.auth.event} for user %{USERNAME:system.auth.user} by %{USERNAME:system.auth.by}\(uid=%{NUMBER:system.auth.uid}\)" ] }
        add_tag => [ "ssh_session_open_su", "grokked" ]
        tag_on_failure => [ ]
      }
    }

    # SSH Session Closed by SU
    if "grokked" not in [tags] and "su" == [system.auth.su] {
      grok {
        match => { "msgsplit", [ "pam_unix\(su-l:session\): %{WORD:system.auth.method} %{WORD:system.auth.event} for user %{USERNAME:system.auth.user}" ] }
        add_tag => [ "ssh_session_closed_su", "grokked" ]
        tag_on_failure => [ ]
      }
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => false
    index => "ssh_auth-%{+YYYY.MM}"
  }
}
After checking my pattern, I am now getting this message:
grok - Invalid setting for grok filter plugin:

  filter {
    grok {
      # This setting must be a hash
      # This field must contain an even number of items, got 3
      match => ["msgsplit", "%{WORD:system.auth.event} %{WORD:system.auth.method} for %{USER:system.auth.user} from %{IPV4:system.auth.srcip} port %{BASE10NUM:system.auth.port} %{PROG:system.auth.program}", "pam_unix\\(sshd:session\\): %{WORD:system.auth.method} %{WORD:system.auth.event} for user %{USERNAME:system.auth.user} by \\(uid=%{NUMBER:system.auth.uid}\\)"]
      ...
    }
  }
Any help with this? Or is there a better way to create a grok pattern for these SSH messages? I am using Logstash 7.2.
match => { "msgsplit" => [ "%{WORD:system.auth.event} %{WORD:system.auth.method} for %{USER:system.auth.user} from %{IPV4:system.auth.srcip} port %{BASE10NUM:system.auth.port} %{PROG:system.auth.program}", "pam_unix\\(sshd:session\\): %{WORD:system.auth.method} %{WORD:system.auth.event} for user %{USERNAME:system.auth.user} by \\(uid=%{NUMBER:system.auth.uid}\\)"] }
I already changed that line and checked everything, but I am still facing an issue like this:
runner - The given configuration is invalid. Reason: Expected one of #, => at line 17, column 23 (byte 556) after filter {
  if [fileset][name] == "auth" {
    grok {
      match => { "message" => [ "%{SYSLOGTIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:system.auth.hostname} sshd(?:\[%{NUMBER:system.auth.pid}\])?: %{GREEDYDATA:msgsplit}",
                 "%{SYSLOGTIMESTAMP:system.auth.timestamp} %{SYSLOGHOST:system.auth.hostname} %{WORD:system.auth.su}: %{GREEDYDATA:msgsplit}" ] }
    }
    # SSH Listening Service
    if "grokked" not in [tags] and "sshd" == [system.auth.pid] {
      grok {
        match => { "msgsplit"
I already checked line 17. What am I missing? This is line 17:
# SSH Listening Service
if "grokked" not in [tags] and "sshd" == [system.auth.pid] {
  grok {
    match => { "msgsplit", [ "%{WORD:service.ssh.hostname} %{WORD:service.ssh.event} on %{IPV4:service.listen.ip} port %{IPORHOST:service.listen.port}." ] }
    add_tag => [ "ssh_listening_service", "grokked" ]
    tag_on_failure => [ ]
  }
}
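That parse error is the same issue again: the quoted config stops right after "msgsplit" because the parser expects => there, not a comma. As a sketch, the same block with only that change (the pattern itself untouched) would be:

# SSH Listening Service
if "grokked" not in [tags] and "sshd" == [system.auth.pid] {
  grok {
    # "msgsplit" => [ ... ]  instead of  "msgsplit", [ ... ]
    match => { "msgsplit" => [ "%{WORD:service.ssh.hostname} %{WORD:service.ssh.event} on %{IPV4:service.listen.ip} port %{IPORHOST:service.listen.port}." ] }
    add_tag => [ "ssh_listening_service", "grokked" ]
    tag_on_failure => [ ]
  }
}

The same "msgsplit" => change is needed in every other grok block in the filter that still uses "msgsplit" followed by a comma.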
Thanks, it's working now. But why is it only getting data from 11 July? I have data from 8 July, yet in Kibana I can only see it from 11 July, even though I have already set the date range in Kibana.
If you are saying that the logs are from July 8th but @timestamp is from today, then to fix that you would use a date filter to parse system.auth.timestamp.
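A minimal sketch of such a date filter, assuming system.auth.timestamp holds a syslog-style value like "Jul  8 06:25:01" (what SYSLOGTIMESTAMP captures); it would go inside the existing filter block, after the grok filters:

date {
  # Parse the captured syslog timestamp into @timestamp,
  # the field Kibana's time picker filters on.
  match => [ "system.auth.timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
  # Syslog timestamps carry no year or timezone; the current year and the
  # pipeline's local timezone are assumed unless you set one, e.g.:
  # timezone => "UTC"
}

With that in place the events should land on their original dates (8 July onward) instead of on the day they were ingested.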
Any help? The secure log has dates from 8 July, but Kibana only shows today's date, 12 July. What am I missing?
I am using the default Filebeat configuration. This is the system module config:
# Module: system
# Docs: https://www.elastic.co/guide/en/beats/filebeat/7.2/filebeat-module-system.html

- module: system
  # Syslog
  syslog:
    enabled: false

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    #var.paths:

    # Convert the timestamp to UTC. Requires Elasticsearch >= 6.1.
    #var.convert_timezone: true

  # Authorization logs
  auth:
    enabled: true

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    #var.paths: ["/var/log/secure"]

    # Convert the timestamp to UTC. Requires Elasticsearch >= 6.1.
    #var.convert_timezone: true
The pattern in a date filter has to exactly match the entire field. So matching against [message] is not going to work. As I said, I intended that to be system.auth.timestamp rather than message.