@timestamp field does not change even though the date filter is correct: defaults to YYYY.01.01

Hello,

I'm having a hard time understanding why I'm receiving logs in my ES with "@timestamp": "2017-01-01T09:32:25.532Z", which should not be the case. As you can see from the ES data below, the field "received_at" is "2017-10-16T09:34:02.956Z", but the date in @timestamp keeps coming out as 2017.01.01. How can this be? It's also affecting the index name, which is wrong as well, even though I have a date filter (see the Logstash filter below). I can also see that Filebeat is sending @timestamp with the current date of the read lines, which is correct. I'm using Logstash 5.6.
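
For reference, the index name in my elasticsearch output is derived from @timestamp, so when @timestamp is wrong the event also lands in the wrong daily index. It looks roughly like this (hosts and the index prefix are placeholders, not my real values):

output {
  elasticsearch {
    hosts       => ["localhost:9200"]                # placeholder
    index       => "xxxxx-applog-%{+YYYY.MM.dd}"     # index name follows @timestamp
    document_id => "%{[@metadata][fingerprint]}"
  }
}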

filter:

else if [type] == "applog" {
  grok {
    match => { "message" => "%{DATA:timestampko} .*" }
    add_field => [ "received_at", "%{@timestamp}" ]
    add_field => [ "received_from", "%{host}" ]
  }
  if "UTC " in [message] {
    ruby {
      code => "
        event.set('dateko', Time.now.strftime('%Y-%m-%d'))
      "
    }
    mutate {
      add_field => {
        "akkatimestamp" => "%{dateko}.%{timestampko}"
      }
      remove_field => ["timestampko", "newtimestamp"]
    }
    date {
      match => [ "akkatimestamp", "YYYY-MM-dd.HH:mm:ss.SSSZZZ", "yyyy-MM-dd.HH:mm:ss.SSSZZZ" ]
      timezone => "UTC"
      target => "@timestamp"
    }
    if [host] == "ip-x-x-x-x" or [host] == "ip-x-x-x-x" {
      date {
        match => [ "akkatimestamp", "YYYY-MM-dd.HH:mm:ss.SSSZZZ", "yyyy-MM-dd.HH:mm:ss.SSSZZZ" ]
        timezone => "UTC"
        target => "akkatimestamp"
      }
      mutate {
        remove_field => "timestamp"
      }
      ruby {
        code => "event.set('@timestamp', event.get('akkatimestamp'));"
      }
    }
  }
}
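
One thing I noticed while debugging (I may be wrong about this): if a date filter only matches the time-of-day portion of a value, the missing month and day seem to default to January 1st of the current year, which is exactly the 2017-01-01 pattern I'm seeing. A hypothetical snippet that would produce that (the field name here is made up):

date {
  # hypothetical: the pattern covers only the time, so the date part of
  # @timestamp falls back to January 1st of the current year
  match  => [ "timeonly", "HH:mm:ss.SSS" ]   # e.g. "09:32:25.532"
  target => "@timestamp"                     # -> 2017-01-01T09:32:25.532Z
}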

ES:

{
  "_index": "xxxxx-applog-2017.01.01",
  "_type": "applog",
  "_id": "%{[@metadata][fingerprint]}",
  "_version": 44980,
  "_score": null,
  "_source": {
    "dateko": "2017-10-16",
    "syslog_severity_code": 5,
    "offset": 212367,
    "syslog_facility": "user-level",
    "input_type": "log",
    "syslog_facility_code": 1,
    "newtimestamp": "%{date} 09:32:25.532UTC",
    "source": "/x/logs/x.log",
    "message": "09:32:25.532UTC DEBUG c.x.x.x.MapSsnActor MapSsnActor(x://x) - close request dialog id=8399289",
    "type": "applog",
    "syslog_severity": "notice",
    "tags": [
      "applogs",
      "beats_input_codec_plain_applied",
      "_grokparsefailure",
      "_rubyexception",
      "_dateparsefailure"
    ],
    "akkatimestamp": "2017-10-16T09:32:25.532Z",
    "received_from": "ip-10-x-x-x",
    "@timestamp": "2017-01-01T09:32:25.532Z",
    "received_at": "2017-10-16T09:34:02.956Z",
    "%{": {
      "@metadata": {
        "fingerprint": {
          "}": 465661659
        }
      }
    },
    "@version": "1",
    "beat": {
      "name": "ip-10-x-x-x",
      "hostname": "ip-10-x-x-x",
      "version": "5.6.2"
    },
    "host": "ip-10-x-x-x",
    "timestamp": "09:32:25.532UTC"
  },
  "fields": {
    "@timestamp": [
      1483263145532
    ],
    "dateko": [
      1508112000000
    ],
    "received_at": [
      1508146442956
    ],
    "akkatimestamp": [
      1508146345532
    ]
  },
  "sort": [
    1483263145532
  ]
}

Already fixed this :), using some ruby filters and removing the "_id": "%{[@metadata][fingerprint]}" setting from the elasticsearch output.
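
In case it helps anyone else, the ruby part of the fix was along these lines (an illustrative sketch only, not my exact config; the field names are the ones from the filter above):

ruby {
  # sketch: rebuild the full timestamp from today's date (dateko) plus the
  # time-of-day captured from the log line (timestampko), then set @timestamp
  code => "
    require 'time'
    raw = event.get('dateko').to_s + 'T' + event.get('timestampko').to_s.sub('UTC', '+00:00')
    event.set('@timestamp', LogStash::Timestamp.new(Time.parse(raw)))
  "
}

For the _id part, an alternative to removing it would be adding a fingerprint filter that actually writes to [@metadata][fingerprint], so the sprintf resolves instead of ending up as the literal document id.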
