I want to create a grok for the log below.
"2016-03-30 19:14:06-0500 [HonsshServerTransport,13657,195.154.43.29] [PLUG IN][EXAMPLE] - {'honey_port': '22', 'sensor_name': 'node3', 'session': {'auths': [{'username': 'admin', 'date_time': '20160330_191406_202662', 'spoofed': False, 'password': 'default', 'success': False}], 'country': 'France', 'start_time': '20160330_191402_828898', 'log_location': 'sessions/node3/195.154.43.29/', 'session_id': 'defe2632f71c405f827e693c8da6577b', 'peer_port': '56043', 'channels': [], 'version': 'SSH-2.0-JSCH-0.1.51', 'end_time': '20160330_191406_388121', 'peer_ip': '195.154.43.29'}, 'honey_ip': '192.168.1.8'}"
The easiest way is probably to parse the first part using grok and capture the rest of the message in a single field, as the rest looks similar to JSON. It does not, however, appear to be strictly valid JSON, so you may need to run a mutate gsub filter on it, replacing ' with " and False with false. You should then be able to parse that field using the json filter.
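As a quick sanity check outside Logstash, the gsub-then-json idea can be sketched in Python (using a trimmed, hypothetical slice of the payload above):

```python
import json

# Trimmed payload in the Python-repr style HonSSH emits:
# single quotes and capitalized booleans, which are not valid JSON.
raw = "{'honey_port': '22', 'sensor_name': 'node3', 'spoofed': False}"

# Equivalent of the two gsub substitutions: ' -> " and False -> false
fixed = raw.replace("'", '"').replace("False", "false")

# Equivalent of the json filter
parsed = json.loads(fixed)
print(parsed["sensor_name"])  # node3
```

Note this naive replacement would also rewrite any single quotes or the literal string "False" inside field values, which is usually acceptable for this kind of log but worth keeping in mind.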
Thanks Christian,
I added these lines to my filter and the Logstash configuration was accepted, but I still got this error -> Invalid argument - Invalid file
mutate {
  gsub => [
    "message", "'", '"',
    "message", "False", "false"
  ]
}
json {
  source => "message"
}
Below is my logstash.conf. The configuration runs OK, but I still get this error -> Invalid argument - Invalid file
input {
  file {
    path => "/opt/honssh/logs/*"
    type => "honsshdaily"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  if [type] == "honsshdaily" {
    grok {
      match => { "message" => "%{NUMBER}%{NUMBER}%{NUMBER},%{IPV4:attack_src},%{USERNAME:username},%{WORD:password},%{NONNEGINT:success_code}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    geoip {
      source => "attack_src"
      target => "geoip"
      database => "/opt/logstash/GeoLiteCity.dat"
      add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
    }
    mutate {
      convert => [ "[geoip][coordinates]", "float" ]
      gsub => [
        "message", "'", '"',
        "message", "False", "false"
      ]
    }
    json {
      source => "message"
    }
  }
}
output {
  elasticsearch { hosts => "127.0.0.1:9200" }
  stdout { codec => rubydebug }
}
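As an aside, "Invalid argument - Invalid file" from the file input is most often reported when Logstash runs on Windows, where "/dev/null" is not a valid path; the path "/opt/honssh/logs/*" above suggests Linux, so treat this as a guess, but if Windows is involved, the equivalent null sink is NUL:

```
input {
  file {
    path => "/opt/honssh/logs/*"
    type => "honsshdaily"
    start_position => "beginning"
    # On Windows there is no /dev/null; NUL is the equivalent null device
    sincedb_path => "NUL"
  }
}
```

On Linux, it is worth checking instead that everything matched by the glob is a regular, readable file.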
Try separating out the fields before the start of the JSON-like structure, then mutate the last field into valid JSON and apply the json filter, possibly something like this example:
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} \[%{WORD:text},%{NUMBER:num},%{IPV4:attack_src}\]\s*\[%{DATA:plugin}\]\[%{WORD:event}\] -\s*%{GREEDYDATA:rest}" }
  }
  mutate {
    gsub => [
      "rest", "'", '"',
      "rest", "False", "false"
    ]
  }
  json {
    source => "rest"
  }
  mutate {
    remove_field => ["rest", "message"]
  }
}
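The same two-stage parse (split off the prefix, then clean and decode the JSON-like tail) can be mimicked in Python with a plain regex standing in for the grok pattern. This is a rough sketch, not the real grok library, and the group names are illustrative:

```python
import json
import re

# Trimmed, hypothetical version of the HonSSH log line above
line = ("2016-03-30 19:14:06-0500 [HonsshServerTransport,13657,195.154.43.29] "
        "[PLUG IN][EXAMPLE] - {'honey_port': '22', 'sensor_name': 'node3'}")

# Simplified stand-in for the grok pattern: timestamp, bracketed
# transport/pid/ip, two bracketed labels, then the JSON-like remainder.
pattern = re.compile(
    r"^(?P<timestamp>\S+ \S+) "
    r"\[(?P<text>\w+),(?P<num>\d+),(?P<attack_src>[\d.]+)\]\s*"
    r"\[(?P<plugin>[^\]]+)\]\[(?P<event>\w+)\] -\s*(?P<rest>.*)$"
)

m = pattern.match(line)

# Equivalent of the gsub + json steps on the captured remainder
rest = m.group("rest").replace("'", '"').replace("False", "false")
data = json.loads(rest)
print(m.group("attack_src"), data["sensor_name"])  # 195.154.43.29 node3
```

If the regex matches and the JSON decodes, the remaining fields can be dropped just as the final mutate/remove_field does in the Logstash config.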