I've been stuck on this problem for 4 days.
The logs that appear in Kibana have their time field shifted by +2 hours, and on top of that, the logs that appear in Kibana are logs from two hours ago.
It's currently 14h10; here is the last log that appears in Logstash.
For example:
The time field is the date extracted from the message, and you can see that there is a shift of 2 hours.
That's the first problem.
The second problem is that the last log retrieved should be a different one, from 14h25, not this one.
I don't know whether this second problem comes from Logstash or Kibana, but only logs that are at least two hours older than the current time are picked up. So to see a log from 2pm, I have to wait until 4pm.
Here is my logstash.conf file:
input {
  file {
    path => "/var/log/all_logs/common/logs/**/*.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => multiline {
      pattern => "^\d{4}-\d{2}-\d{2}"
      negate => true
      what => "previous"
    }
    type => "app"
  }
}
filter {
  if [type] == "app" {
    grok {
      match => { "message" => [
        "%{TIMESTAMP_ISO8601:time} %{LOGLEVEL:log_level} %{GREEDYDATA:message_of_log}",
        "%{TIMESTAMP_ISO8601:time} \[ %{LOGLEVEL:log_level}\] %{GREEDYDATA:message_of_log}\(%{GREEDYDATA}\)",
        "%{TIMESTAMP_ISO8601:time} \[ %{LOGLEVEL:log_level}\] %{GREEDYDATA:message_of_log}",
        "%{TIMESTAMP_ISO8601:time} \[%{LOGLEVEL:log_level}\] %{GREEDYDATA:message_of_log}",
        "%{TIMESTAMP_ISO8601:time} \[%{DATA:miscellaneous}\] %{LOGLEVEL:log_level} %{GREEDYDATA:message_of_log}"
      ]}
    }
    date {
      match => ["time", "yyyy-MM-dd HH:mm:ss.SSS", "yyyy-MM-dd HH:mm:ss", "dd-MMM-yyyy HH:mm:ss.SSS", "yyyy-MM-dd HH:mm:ss,SSS"]
      timezone => "Europe/Brussels"
      target => "@time"
    }
  }
}
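To compare what Logstash actually emits for the time, @time, and @timestamp fields, a temporary stdout output can be added for debugging (a minimal sketch; my real pipeline sends events to Elasticsearch, this block is only for inspection):

```
output {
  # Temporary debug output: prints every event with all its fields
  # using the rubydebug codec, so the parsed date values can be
  # compared against the raw log line.
  stdout {
    codec => rubydebug
  }
}
```

With this in place, each event is printed to the Logstash console, which makes it possible to tell whether the 2-hour shift is introduced at parse time or only when Kibana displays the data.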
Before trying to fix this, I didn't have this date filter:
date {
}
Instead, I was converting the time field to the date type in my index mapping:
"time": {
"type": "date",
"format": "yyyy-MM-dd HH:mm:ss.SSS||yyyy-MM-dd HH:mm:ss||dd-MMM-yyyy HH:mm:ss.SSS||yyyy-MM-dd HH:mm:ss,SSS",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},