Unable to replace @timestamp with another field in Logstash

I want to set @timestamp from the value of another field. I have read your answers online but could not resolve the issue.

I have converted a string to a date and want this date to be reflected in @timestamp, but no luck.

Below is my filter configuration:

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGLINE}" }
    }
    mutate {
      split => { "message" => "~" }
      add_field => { "QueueManagerName" => "%{[message][0]}" }
      add_field => { "Date" => "%{[message][1]}" }
      add_field => { "Time" => "%{[message][2]}" }
      add_field => { "hh:mm" => "%{[message][3]}" }
      add_field => { "Shift" => "%{[message][4]}" }
      add_field => { "Queue" => "%{[message][5]}" }
      add_field => { "maxDepth" => "%{[message][6]}" }
      add_field => { "P_Put" => "%{[message][10]}" }
      add_field => { "Get_P" => "%{[message][14]}" }
      add_field => { "timestamp" => "%{Date} %{Time}" }
    }
    date {
      target => "timestamp"
      match => [ "timestamp", "yyyy-MM-dd HH:mm:ss", "ISO8601" ]
      timezone => "UTC"
    }
  }
}

The fields in the log message are separated by ~, so I used split to assign each value to a separate field.

timestamp and @timestamp do not have the same value.

timestamp contains the value from the log, but @timestamp is set to the current time.

Can you please help me figure out the issue?

Thanks in advance.

Please show us an example event. Use a stdout { codec => rubydebug } output.
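For reference, a minimal output section that prints every event to the console looks like this:

output {
  stdout { codec => rubydebug }
}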

timestamp and @timestamp do not have the same value.

Why would they have the same value when you've configured the date filter to store the parsed timestamp into the timestamp field, leaving @timestamp untouched?
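If the goal is to overwrite @timestamp with the parsed date, simply omit the target option; the date filter writes to @timestamp by default. A minimal sketch based on your config:

date {
  match => [ "timestamp", "yyyy-MM-dd HH:mm:ss", "ISO8601" ]
  timezone => "UTC"
}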

Hi,
Thanks. I have changed the code and am now adding the newly calculated datetime value to a new field called eventtimestamp, leaving @timestamp alone.

Under mutate I have added the line below:

add_field => {"eventtimetsamp" => "%{Date} %{Time}"}

and below is my date filter:

date {
  target => "eventtimestamp"
  match => [ "eventtimestamp", "yyyy-MM-dd HH:mm:ss.SSSZ", "ISO8601" ]
  timezone => "UTC"
}

Now, the expectation is that this new field will be a date field that I can use in Kibana, but it is coming through as a string. Can you please take a look at my code and tell me what I am doing wrong?

Also, here is my output configuration:
output {
  elasticsearch {
    hosts => ":9200"
    index => "logstash-%{+YYYY.MM.dd}"
    user => "logstash_user"
    password => ""
  }
  stdout { codec => rubydebug }
}

Now, the expectation is that this new field will be a date field that I can use in Kibana, but it is coming through as a string. Can you please take a look at my code and tell me what I am doing wrong?

Is the date filter successful? What does an event look like? What probably happened is that you sent a document to ES with an eventtimestamp value that wasn't parseable as a date, so the field was mapped as a string. Since a field's mapping can't be changed after the fact for a particular index, it made no difference that subsequent documents may have had eventtimestamp values that would have been recognized as dates. Unless you have precious data in the index, just delete it and run Logstash again to index new data.
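For example, you can inspect the mapping and delete the index via the Elasticsearch API (the index name logstash-2018.05.28 below is only an illustration; substitute your own):

# check how eventtimestamp was mapped (hypothetical index name)
curl -XGET 'localhost:9200/logstash-2018.05.28/_mapping'
# delete the index so it can be recreated with a date mapping
curl -XDELETE 'localhost:9200/logstash-2018.05.28'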

This is another reason why it's a good idea to only enable the elasticsearch output once you've verified with e.g. a stdout output that things are working.
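In other words, while testing you can keep the elasticsearch output commented out, roughly like this:

output {
  # elasticsearch { ... }   # re-enable once events look correct
  stdout { codec => rubydebug }
}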

Thanks. It worked after recreating the index.
