Logstash parsing @timestamp even before the filter is triggered

Hi,

  • We are noticing that Logstash parses the @timestamp field even before the filter{} block runs, which triggers the warning Error parsing @timestamp string value and spams our logs.
  • We push logs from fluent-bit --> Logstash --> Elasticsearch.

How to reproduce:

  • Set up a docker-compose.yaml for Elasticsearch and Kibana.
  • Run a Logstash server with the following pipeline:
input {
  http {
    port => 8080
  }
}
filter {
  if [@timestamp] {
    # Drop the @timestamp that arrived with the log event
    mutate {
      remove_field => [ "@timestamp" ]
    }

    # Record the time Logstash processed the event
    ruby {
      code => "event.set('logstash_processed_at', Time.now)"
    }

    mutate {
      convert => { "logstash_processed_at" => "string" }
    }

    # Use the processing time as the new @timestamp
    date {
      match => [ "logstash_processed_at", "ISO8601" ]
      target => "@timestamp"
      add_tag => [ "timestamp_changed" ]
    }
  }
}
output {
  elasticsearch {
    ssl_certificate_verification => false
    hosts => ["elasticsearch:9200"]
    manage_template => true
    template_name => "k8s"
    index => "test"
    user => "elastic"
    password => "PASSWORD_HERE"
    codec => "json"
  }
}

  • Now set up a fluent-bit container that pushes logs to the Logstash server:
[SERVICE]
    Log_Level    info

[INPUT]
    Name dummy
    Dummy {"@message":"HTTP GET /prometheus","@timestamp":"2022-07-01 20:38:53"}

[OUTPUT]
    Name http
    Host logstash
    Port 8080
    Format json

That is when we notice the Logstash server throwing this warning:

[2022-07-01T23:44:36,496][WARN ][org.logstash.Event ][main][0f6b528c835b85f2aeef2fb1488d39cbb85d83d9b67ac664cabd98dae739f089] Error parsing @timestamp string value=2022-07-01 20:38:53

As you can see from the pipeline config, we are removing the existing @timestamp field and replacing it with a new value.

You can also see in the screenshot below that the logstash_processed_at field added above is present in the indexed logs.

Basically, we want to ignore the @timestamp field if it exists in the incoming logs, use the current time instead, and stop seeing the Error parsing @timestamp warnings.

  • Thank you

If Format json causes the POST to the http input to have Content-Type: application/json, then the input will run the POST body through a json codec. When the input tries to create the event from the parsed JSON, it gets an exception down inside initTimestamp. Apparently "2022-07-01 20:38:53" is not a supported date format in that context.

You could set additional_codecs => {} to override the default setting on the input, then use something like

mutate { gsub => [ "message", '"@timestamp"\s*:\s*"[^"]*"', "" ] }

to strip the timestamp out, then use a json filter to parse the JSON.
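
Putting that together, a rough sketch of the input and filter sections might look like this (untested; the second gsub pair, which cleans up the comma left dangling when @timestamp is the last key in the payload, is my addition, so adjust the patterns to your actual payload):

input {
  http {
    port => 8080
    # Empty hash removes the default "application/json" => "json" mapping,
    # so the body arrives untouched in [message] instead of being parsed
    # (and timestamped) by the input codec.
    additional_codecs => {}
  }
}
filter {
  mutate {
    gsub => [
      # strip the @timestamp key and, if present, the comma that follows it
      "message", '"@timestamp"\s*:\s*"[^"]*"\s*,?', "",
      # tidy up a comma left hanging before the closing brace
      "message", ',\s*}', "}"
    ]
  }
  # Parse what is left; @timestamp keeps the time Logstash created the event.
  json {
    source => "message"
  }
}

With the @timestamp key stripped before parsing, the event keeps the @timestamp that Logstash assigned when it received the request, which sounds like what you want.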

Alternatively, try

Dummy {"@message":"HTTP GET /prometheus","@timestamp":"2022-07-01T20:38:53.000Z"}

Thank you, that makes sense.
