Hello,
A few days ago, my ELK + Filebeat stack stopped working properly, and I cannot find the source of the problem. I'm using:
- Logstash 6.5.3-1
- Filebeat version 6.4.0 (amd64), libbeat 6.4.0
- Elasticsearch 6.5
My logs are in bad shape in Kibana: everything shows up as raw JSON-like text instead of separate, queryable fields. I don't remember exactly when it changed, but they used to be parsed correctly. Now I cannot query them, which defeats the purpose of such a setup.
Here are my configs:
Filebeat:
- type: log
  paths:
    - /var/log/auth.log
    - /var/log/faillog
    - /var/log/secure
  exclude_files: [".gz$"]
  tags: ["auth"]
  fields_under_root: true

processors:
  - add_locale: ~
Logstash:
input {
  beats {
    port => "5044"
  }
}

filter {
  if "auth" in [tags] {
    grok {
      add_field => { "foo" => "bar" }
      match => {
        "message" => [
          DELETED_BECAUSE_MESSAGE_WOULD_BE_TOO_LONG
        ]
      }
      pattern_definitions => {
        "GREEDYMULTILINE" => "(.|\n)*"
      }
      # remove_field => "message"
    }
    date {
      match => [ "[system][auth][timestamp]", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
    geoip {
      source => "[system][auth][ssh][ip]"
      target => "[system][auth][ssh][geoip]"
    }
    json {
      source => "system"
    }
  }
}

output {
  if "auth" in [tags] {
    stdout { }
    elasticsearch {
      hosts => "http://elasticsearch.example.lan:9200"
      index => "logs-auth-%{+YYYY.MM.dd}"
      ssl_certificate_verification => "false"
      template_overwrite => true
      template => "/etc/logstash/template.json"
      manage_template => "true"
    }
  }
}
As you can see, the filter section is being applied: the "foo" field is added and the geoip lookup is done. Yet grok does not seem to be working, as if it cannot find the message.
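To rule grok in or out, my understanding is that grok adds a _grokparsefailure tag by default when no pattern matches, and that add_field only fires on a successful match. A custom failure tag (the tag name below is mine) would make non-matching events easy to spot in Kibana:

grok {
  # ... same match patterns as above ...
  # sketch: replace the default "_grokparsefailure" tag with an explicit
  # one, so events where the message really was not matched stand out
  tag_on_failure => ["_auth_grok_failed"]
}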
The Logstash logs show two things. First, I cannot force-parse some elements (like system or beat) because, apparently, they are not JSON strings:
Error parsing json {:source=>"system", :raw=>{"auth"=>{"hostname"=>"REDACTED", "ssh"=>{"ip"=>"REDACTED", "geoip"=>{"country_code2"=>"SG", "region_code"=>"01", "latitude"=>REDACTED, "continent_code"=>"AS", "ip"=>"REDACTED", "longitude"=>REDACTED, "country_name"=>"Singapore", "country_code3"=>"SG", "city_name"=>"Singapore", "region_name"=>"Central Singapore Community Development Council", "timezone"=>"Asia/Singapore", "location"=>{"lon"=>REDACTED, "lat"=>REDACTED}}}, "timestamp"=>"Dec 14 12:55:02", "pid"=>"20333"}}, :exception=>java.lang.ClassCastException: org.jruby.RubyHash cannot be cast to org.jruby.RubyIO}
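If I read this exception right, the json filter expects its source field to be a string, but at this point [system] is already a nested hash built by grok, which would also explain the _jsonparsefailure tag below. A guard I am considering (just a sketch, meant to replace the json filter above) would only attempt parsing while the field is still a raw string:

ruby {
  # sketch: parse [system] only if it is still a raw JSON string;
  # if grok already produced a hash, leave the event untouched
  code => '
    require "json"
    sys = event.get("system")
    event.set("system", JSON.parse(sys)) if sys.is_a?(String)
  '
}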
And second, the log is sent properly by Filebeat:
{
    "beat" => {
        "hostname" => "REDACTED",
        "timezone" => "+01:00",
        "version" => "6.5.3",
        "name" => "REDACTED"
    },
    "message" => "Dec 14 12:55:02 REDACTED sshd[20333]: Disconnected from REDACTED port 48204 [preauth]",
    "input" => {
        "type" => "log"
    },
    "foo" => "bar",
    "source" => "/var/log/auth.log",
    "host" => {
        "name" => "REDACTED"
    },
    "fields" => {
        "cluster" => "dmz",
        "env" => "int",
        "region" => "REDACTED"
    },
    "offset" => 1031460,
    "@version" => "1",
    "prospector" => {
        "type" => "log"
    },
    "@timestamp" => 2018-12-14T11:55:02.000Z,
    "system" => {
        "auth" => {
            "hostname" => "REDACTED",
            "ssh" => {
                "ip" => "REDACTED",
                "geoip" => {
                    "country_code2" => "SG",
                    "region_code" => "01",
                    "latitude" => REDACTED,
                    "continent_code" => "AS",
                    "ip" => "REDACTED",
                    "longitude" => REDACTED,
                    "country_name" => "Singapore",
                    "country_code3" => "SG",
                    "city_name" => "Singapore",
                    "region_name" => "Central Singapore Community Development Council",
                    "timezone" => "Asia/Singapore",
                    "location" => {
                        "lon" => REDACTED,
                        "lat" => REDACTED
                    }
                }
            },
            "timestamp" => "Dec 14 12:55:02",
            "pid" => "20333"
        }
    },
    "tags" => [
        [0] "auth",
        [1] "beats_input_codec_plain_applied",
        [2] "_jsonparsefailure"
    ]
}
So I cannot understand why these objects are not being parsed correctly into ES.
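In case it helps, here is a debug output I could add next to the elasticsearch one to capture the exact documents Logstash ships (the path is arbitrary):

output {
  if "auth" in [tags] {
    # sketch: dump outgoing events to a local file for inspection
    file {
      path => "/tmp/logstash-auth-debug.json"
      codec => json_lines
    }
  }
}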
