Hi,

- We are noticing that Logstash parses the `@timestamp` field even before the `filter {}` block is triggered, and this is causing the warning `Error parsing @timestamp string value` and spamming our logs.
- We push logs from fluent-bit --> Logstash --> Elasticsearch.
How to reproduce:
- Set up a docker-compose.yaml with Elasticsearch and Kibana.
- Run a Logstash server with the following pipeline:
```
input {
  http {
    port => 8080
  }
}

filter {
  if [@timestamp] {
    mutate {
      remove_field => [ "@timestamp" ]
    }
    ruby {
      code => "event.set('logstash_processed_at', Time.now());"
    }
    mutate {
      convert => { "logstash_processed_at" => "string" }
    }
    date {
      match => [ "logstash_processed_at", "ISO8601" ]
      target => "@timestamp"
      add_tag => "timestamp_changed"
    }
  }
}

output {
  elasticsearch {
    ssl_certificate_verification => false
    hosts => ["elasticsearch:9200"]
    manage_template => true
    template_name => "k8s"
    index => "test"
    user => "elastic"
    password => "PASSWORD_HERE"
    codec => "json"
  }
}
```
- Now set up a fluent-bit container that pushes logs to the Logstash server:
```
[SERVICE]
    Log_Level    info

[INPUT]
    Name         dummy
    Dummy        {"@message":"HTTP GET /prometheus","@timestamp":"2022-07-01 20:38:53"}

[OUTPUT]
    Name         http
    Host         logstash
    Port         8080
    Format       json
```
- That's when we notice the Logstash server throwing this warning:

```
[2022-07-01T23:44:36,496][WARN ][org.logstash.Event ][main][0f6b528c835b85f2aeef2fb1488d39cbb85d83d9b67ac664cabd98dae739f089] Error parsing @timestamp string value=2022-07-01 20:38:53
```

As you can see from the pipeline config, we are removing the existing `@timestamp` field and replacing it with a new value, yet the warning is emitted anyway.
You can also see in the screenshot below that the `logstash_processed_at` field added above is present in the logs.
Basically, we want to ignore the `@timestamp` field IF it exists in the incoming logs, use the current time instead, and stop seeing the `Error parsing @timestamp` warnings.
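Since the warning comes from Logstash's event constructor, which (as described above) runs before any `filter {}` block, one possible workaround is to rename the field on the fluent-bit side with its `modify` filter, so the raw `@timestamp` string never reaches Logstash. This is a sketch we have not verified against this setup; the `fluentbit_timestamp` target name is just an example:

```ini
[FILTER]
    # Rename the key before the record is shipped to Logstash,
    # so org.logstash.Event never tries to parse it as a timestamp.
    Name      modify
    Match     *
    Rename    @timestamp fluentbit_timestamp
```

Alternatively, if simply suppressing the message is acceptable, the log level of the `org.logstash.Event` logger could be raised to ERROR in Logstash's `config/log4j2.properties`.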
- Thank you