Log flooded with error messages

Hi,

I am trying to set up the ELK stack using this blog article: https://pawelurbanek.com/elk-nginx-logs-setup

Filebeat is running and sending data to Logstash.
My Logstash configuration looks like this (I removed the SSL part for testing):

input {
  beats {
    port => 5400
  }
}

filter {
  grok {
    match => [ "message", "%{COMBINEDAPACHELOG}+%{GREEDYDATA:extra_fields}" ]
    overwrite => [ "message" ]
  }
  mutate {
    convert => ["response", "integer"]
    convert => ["bytes", "integer"]
    convert => ["responsetime", "float"]
  }
  geoip {
    source => "clientip"
    target => "geoip"
    add_tag => [ "nginx-geoip" ]
  }
  date {
    match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
    remove_field => [ "timestamp" ]
  }
  useragent {
    source => "agent"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "weblogs-%{+YYYY.MM.dd}"
    document_type => "nginx_logs"
  }
  stdout { codec => rubydebug }
}

Elasticsearch is running and accepting input on port 9200, but when I try to discover the data in Kibana I don't find the configured index pattern.

/var/log/logstash/logstash-plain.log is full of these errors, which I can't interpret:

[2020-02-25T17:43:05,842][ERROR][logstash.filters.useragent][main] Uknown error while parsing user agent data {:exception=>#<TypeError: cannot convert instance of class org.jruby.RubyHash to class java.lang.String>, :field=>"agent", :event=>#<LogStash::Event:0x4a5caa09>}
It seems no data arrives in Kibana at all.
How do I fix this?

Yours faithfully
Stefan Malte Schumacher

The useragent filter expects to parse a string field called [agent], which I would expect to have been created by the COMBINEDAPACHELOG pattern. However, it is finding that the [agent] field is a hash, not a string.
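To illustrate the string case the filter expects, here is a minimal pipeline sketch (the generator input and the sample message are just placeholders I made up, not from your setup):

input {
  generator {
    count => 1
    message => "Mozilla/5.0 (X11; Linux x86_64; rv:73.0) Gecko/20100101 Firefox/73.0"
  }
}
filter {
  # [message] is a plain string here, so the useragent filter parses it cleanly.
  useragent { source => "message" }
}
output { stdout { codec => rubydebug } }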

What does an event look like in the rubydebug output?

Hello,

I have started Logstash manually and monitored the output. There are sections like the following (hostnames and IPs replaced by XXX):

"log" => {
"offset" => 36099,
"file" => {
"path" => "/var/log/nginx/access_corporate-lounge.de.log"
}
},
"tags" => [
[0] "beats_input_codec_plain_applied",
[1] "nginx-geoip"
],
"clientip" => "89.XXX",
"request" => "/",
"message" => "89.XXX - - [26/Feb/2020:13:03:46 +0100] "GET / HTTP/1.1" 301 178 "-" "check_http/v2.2 (monitoring-plugins 2.2)"",
"httpversion" => "1.1",
"ident" => "-",
"referrer" => ""-"",
"auth" => "-",
"verb" => "GET",
"@timestamp" => 2020-02-26T12:03:46.000Z,
"response" => 301,
"ecs" => {
"version" => "1.1.0"
},
"agent" => {
"id" => "b79a760d-b445-430b-86eb-c27229ebea56",
"ephemeral_id" => "32cda409-1a33-4864-a478-9c83110f45ce",
"version" => "7.5.2",
"hostname" => "xxxx",
"type" => "filebeat"
},
"@version" => "1",
"host" => {
"containerized" => false,
"os" => {
"family" => "debian",
"version" => "9 (stretch)",
"codename" => "stretch",
"kernel" => "4.9.0-8-amd64",
"platform" => "debian",
"name" => "Debian GNU/Linux"
},
"hostname" => "xxxxxx",
"id" => "522a68580a704f4b85b17ef9c7e870a7",
"architecture" => "x86_64",
"name" => "xxxxx"
},
"bytes" => 178,
"geoip" => {
"region_code" => "NH",
"ip" => "89.XXX",
"timezone" => "Europe/Amsterdam",
"country_code2" => "NL",
"city_name" => "Schellinkhout",
"latitude" => 52.6371,
"country_name" => "Netherlands",
"country_code3" => "NL",
"postal_code" => "1697",
"continent_code" => "EU",
"region_name" => "North Holland",
"longitude" => 5.1224,
"location" => {
"lat" => 52.6371,
"lon" => 5.1224
}
},
"input" => {
"type" => "log"

interspersed with the errors seen in the Logstash logs:

[ERROR] 2020-02-26 13:08:45.590 [[main]>worker1] useragent - Uknown error while parsing user agent data {:exception=>#<TypeError: cannot convert instance of class org.jruby.RubyHash to class java.lang.String>, :field=>"agent", :event=>#<LogStash::Event:0x18ece00f>}

[ERROR] 2020-02-26 13:07:50.580 [[main]>worker2] useragent - Uknown error while parsing user agent data {:exception=>#<TypeError: cannot convert instance of class org.jruby.RubyHash to class java.lang.String>, :field=>"agent", :event=>#<LogStash::Event:0x596ebf58>}

Is this of any help in diagnosing the problem?

Yours faithfully
Stefan

filebeat is adding this hash called [agent], which conflicts with the use of COMBINEDAPACHELOG, since that pattern wants to store the user agent string from the log line in a field of the same name.

And how do I prevent filebeat from adding this hash? Please keep in mind that I am just beginning to work with ELK, so things that might be very easy for an experienced user are still difficult for me.
Yours
Stefan

I do not run filebeat, so I do not know whether you can prevent filebeat from adding it, but you can certainly use mutate+rename to move it to another name and eliminate the conflict.
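For example, something like this at the top of your filter section (an untested sketch; beat_agent is just a name I picked) would move it out of the way:

filter {
  # Rename Filebeat's [agent] hash so that COMBINEDAPACHELOG can create
  # the [agent] string field that the useragent filter expects.
  mutate {
    rename => { "agent" => "beat_agent" }
  }
  # ... the rest of your existing filters (grok, mutate, geoip, date, useragent)
}

Filter order matters here: the mutate has to run before the grok that parses the access log line.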
