Logstash GROK

Hello Community,

I'm a bit lost with creating a grok filter and need some help...
I am using Filebeat on AIX to ship errpt messages via syslog to Logstash, with Kibana for visualization...
The log coming from the AIX error daemon (errdemon) is collected by syslog and looks like this:

Sep 11 08:32:27 svrseng3-0 local4:warn|warning root: IDENTIFIER: AA8AB241 Sequence Number: 36 Machine Id: 00C0C5504B00 Node Id: svrseng3-0 Class: O Type: TEMP WPAR: Global Resource Name: OPERATOR Description OPERATOR NOTIFICATION User Causes ERRLOGGER COMMAND Recommended Actions REVIEW DETAILED DATA Detail Data MESSAGE FROM ERRLOGGER COMMAND test for logstash active filters and output config

My filter looks like this:

filter {
        if [type] == "aix-beat" {
              grok {
                match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{SYSLOGFACILITY} %{USERNAME} %{GREEDYDATA:syslog_message}" }
                add_field => [ "received_at", "%{@timestamp}" ]
                add_field => [ "received_from", "%{host}" ]
                }
                date {
                  match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
                }
        }
}

Unfortunately, it is not working...
When I try to start Logstash with this filter, I get an error and Logstash shuts down again...

[2020-09-14T10:04:37,705][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Duplicate keys found in your configuration: [add_field]\nAt line: 4, column 22 (byte 78)\nafter filter {\n    if [type] == \"aix-beat\" {\n          grok {\n            match => {", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler/lscl.rb:183:in `validate!'"

Can anybody help me out? I've been trying to get this to work for a week...

Thanks a lot in advance for your support.

Regards
Joerg

Hello Jörg,

Welcome to this forum! The error message is very clear about the root cause:

Duplicate keys found in your configuration: [add_field]

In your grok filter, you have the add_field option twice, and that is what causes the error. You can add multiple fields in a single add_field hash like this:

filter {
  if [type] == "aix-beat" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{SYSLOGFACILITY} %{USERNAME} %{GREEDYDATA:syslog_message}" }
      add_field => {
        "received_at"   => "%{@timestamp}"
        "received_from" => "%{host}"
      }
    }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

See here for details: Grok filter plugin | Logstash Reference [master] | Elastic
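
By the way, you can let Logstash validate a pipeline file before starting it; that check would have reported the duplicate key right away. A quick sketch, assuming a standard installation under /usr/share/logstash and a pipeline file named /etc/logstash/conf.d/aix-beat.conf (the file name is just an example, adjust the paths to your setup):

# Check the configuration for syntax errors (such as duplicate keys)
# and exit without starting the pipeline.
/usr/share/logstash/bin/logstash --config.test_and_exit -f /etc/logstash/conf.d/aix-beat.conf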

Best regards
Wolfram

Found it, thanks a lot... it was a misplaced "}".

Now it works so far... but all entries show up twice in Kibana...
Any idea where that comes from?

Best regards
Joerg

I don't think Logstash does the duplication - maybe the syslog message is sent twice? We once had the problem that a message was written to two different loggers, which then both forwarded it to syslog...
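
If the duplicates cannot be avoided at the source, one common workaround is to give every event a deterministic ID with the fingerprint filter and use that as the Elasticsearch document ID; a resent event then overwrites the existing document instead of creating a second one. This is only a rough sketch, and the elasticsearch output settings (hosts, index) are placeholders for whatever you use today:

filter {
  fingerprint {
    # Hash the raw message; identical events get the same fingerprint.
    source => ["message"]
    target => "[@metadata][fingerprint]"
    method => "SHA256"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]        # placeholder
    index => "aix-beat-%{+YYYY.MM.dd}" # placeholder
    # Duplicates now update the same document instead of adding a new one.
    document_id => "%{[@metadata][fingerprint]}"
  }
}

Still, it is worth finding the real source of the duplication first (Filebeat resending, or syslog forwarding the message twice).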

The message is written only once to the logfile on the sending host...
So you mean Filebeat could be the problem?
