I'm trying to index Logstash logs with the Filebeat Logstash module. Everything works fine except @timestamp.
This is the original log line:
[2019-09-30T09:22:27,323][DEBUG][org.logstash.beats.BeatsHandler] [local: 0:0:0:0:0:0:0:1:5044, remote: 0:0:0:0:0:0:0:1:7675] Sending a new message for the listener, sequence: 612
The timestamp is written without a timezone; the actual local timezone is +08:00. So the time part should really be:
2019-09-30T09:22:27,323+0800
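To make the intended interpretation concrete, here is a small Python sketch (illustration only, not part of the pipeline): parse the zoneless string as naive local time, then attach the real +08:00 offset to recover the correct instant.

```python
from datetime import datetime, timezone, timedelta

# The raw timestamp from the Logstash log, with no zone suffix.
raw = "2019-09-30T09:22:27,323"

# Parse as a naive datetime, then attach the actual local offset (+08:00).
naive = datetime.strptime(raw, "%Y-%m-%dT%H:%M:%S,%f")
local = naive.replace(tzinfo=timezone(timedelta(hours=8)))

print(local.isoformat())
# 2019-09-30T09:22:27.323000+08:00
print(local.astimezone(timezone.utc).isoformat())
# 2019-09-30T01:22:27.323000+00:00  (same instant expressed in UTC)
```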
The default ingest pipeline processes the timestamp like this:
{
  "date": {
    "field": "logstash.log.timestamp",
    "target_field": "@timestamp",
    "formats": [
      "ISO8601"
    ],
    "ignore_failure": true
  }
},
{
  "date": {
    "if": "ctx.event.timezone != null",
    "field": "@timestamp",
    "formats": ["ISO8601"],
    "timezone": "{{ event.timezone }}",
    "on_failure": [{"append": {"field": "error.message", "value": "{{ _ingest.on_failure_message }}"}}]
  }
},
The event.timezone field is "+08:00", and the condition matches as expected. But in Elasticsearch, @timestamp ends up as
2019-09-30T17:22:27,323+0800
which is 8 hours later than the original time, so something is wrong.
I found a similar problem in https://discuss.elastic.co/t/filebeat-7-3-syslog-auth-log-timezone-parsing-error/196031, so I switched to a simpler configuration to avoid it:
{
  "date": {
    "field": "logstash.log.timestamp",
    "target_field": "@timestamp",
    "formats": [
      "ISO8601"
    ],
    "timezone": "+08:00",
    "on_failure": [{"append": {"field": "error.message", "value": "{{ _ingest.on_failure_message }}"}}]
  }
}
But nothing changed: the date processor always reads a field that has no timezone as UTC, applies the offset of the timezone I set in the pipeline, and then outputs the result, so the time can never be correct. My question is: can I make the date processor interpret the time string in the timezone I set, instead of UTC? I don't care what timezone the output is in, because the output always contains timezone information, so Kibana can convert it correctly.