Grokparsefailure, and timestamps off by +1 hour

Hi members,

I have a log with three entries:

26.01.2018 15:57:12 Hostname: ApplicationName : INFO - Message
26.01.2018 15:57:12 Hostname: ApplicationName : INFO - Message /configFile=Greedydata /logConfigFolder=Greedydata /step=Greedydata 
26.01.2018 15:57:17 Hostname: ApplicationName : INFO - Message /configFile=Greedydata /logConfigFolder=Greedydata /step=Greedydata

For the last two I get a _grokparsefailure.

(Screenshot: Kibana Discover view)

The strange thing is that the time is +1 hour in every entry. Where does that come from? The data in Kibana is parsed correctly for all three entries, so why isn't every entry a _grokparsefailure because of the time? And where else could the _grokparsefailure come from?

I also debugged the grok section; it is formatted correctly.

My filter:

filter {
        if [fields][LogEvent] == "ApplicationConflict" {
                grok {
                        match => {"source" => "%{PATH}\\%{GREEDYDATA:NameLogDatei}.log"}
                }
                if "/configFile" in [message] {
                        grok {
                                match => {"message" => "%{DATE:Datum} %{TIME:Uhrzeit} %{HOSTNAME:Hostname} : %{GREEDYDATA:ApplicationName} : %{LOGLEVEL:LogLevel} - %{GREEDYDATA:message} /configFile=%{GREEDYDATA:configFile} /logConfigFolder=%{GREEDYDATA:logConfigFolder} /step=%{GREEDYDATA:WP_Step}"}
                                overwrite => ["message"]
                                add_field => {"timestamp" => "%{Datum} %{Uhrzeit}"}
                                remove_field => ["Datum", "Uhrzeit"]
                                add_tag => "3"
                        }
                }
                grok {
                        match => {"message" => "%{DATE:Datum} %{TIME:Uhrzeit} %{HOSTNAME:Hostname} : %{GREEDYDATA:ApplicationName} : %{LOGLEVEL:LogLevel} - %{GREEDYDATA:message}"}
                        overwrite => ["message"]
                        add_field => {"timestamp" => "%{Datum} %{Uhrzeit}"}
                        remove_field => ["Datum", "Uhrzeit"]
                        add_tag => "3"
                }
                date {
                        match => ["timestamp", "dd.MM.yyyy HH:mm:ss"]
                        target => ["@timestamp"]
                        remove_field => ["timestamp"]
                }
        }
}

The first grok ("%{PATH}\%{GREEDYDATA:NameLogDatei}.log") does not match any of those lines, so you should always gets a _grokparsefailure.

The sample log entries do not have a space after the hostname (before the colon), so the other patterns do not match.
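If the logs really did look like the samples, the message grok would have to drop that space, roughly like this (a sketch, not a tested pattern, assuming the line shape "Hostname: ApplicationName : INFO - ..."):

```conf
grok {
        # note "%{HOSTNAME:Hostname}:" with no space before the colon
        match => {"message" => "%{DATE:Datum} %{TIME:Uhrzeit} %{HOSTNAME:Hostname}: %{GREEDYDATA:ApplicationName} : %{LOGLEVEL:LogLevel} - %{GREEDYDATA:message}"}
        overwrite => ["message"]
}
```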

Are you in CET? Elasticsearch timestamps are always UTC. You can specify your timezone in the date filter.

First, thanks for your reply.

The first grok ("%{PATH}\\%{GREEDYDATA:NameLogDatei}.log") does not match any of those lines, so you will always get a _grokparsefailure.

This matches against: D:\LOGFILE\SOME\SOME\SOME\NameLogDatei*.log
It only reads which file is used from the source field, not from the message. In my opinion that's fine; it also works for entries where I don't get a _grokparsefailure.
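For reference, one way to capture only the file name from a Windows path in [source] is to anchor on the last backslash (a sketch, assuming [source] looks like the path above; the field names are the poster's own):

```conf
grok {
        # GREEDYDATA greedily consumes everything up to the last backslash,
        # DATA then captures the file name; the dot before "log" is escaped
        match => {"source" => "%{GREEDYDATA}\\%{DATA:NameLogDatei}\.log"}
}
```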

The sample log entries do not have a space after the hostname (before the colon), so the other patterns do not match.

Sorry, that was my mistake; you are right that there is no space in the samples, but in the original log there is a space. I just had to modify the logs a bit :wink:

Are you in CET? Elasticsearch timestamps are always UTC. You can specify your timezone in the date filter.

That's interesting! OK, that could explain it. Yes, I am in CET. But prove me wrong: I am trying to use the original log time, which is why I overwrite the timestamp that was set when the event was sent to the ELK stack and use the log's timestamp instead.
So what you suggest means that the UTC assumption is applied on top of my log-time configuration, right?

Elasticsearch uses UTC timestamps, and if you do not tell it otherwise, Logstash has to assume your timestamps are in UTC. If they are in CET, add 'timezone => "CET"' to the date filter.
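Applied to the filter above, the date block would then look like this (same match pattern as in the original config, with the timezone option added):

```conf
date {
        match => ["timestamp", "dd.MM.yyyy HH:mm:ss"]
        timezone => "CET"    # interpret the parsed time as CET before converting to UTC
        target => "@timestamp"
        remove_field => ["timestamp"]
}
```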

OK, now it's the right timestamp, thanks a lot.
I still get a _grokparsefailure; I thought it was related to the timestamp, but that's not true.
But it's fine so far.
Thanks

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.