Unable to ignore @timestamp in logs

Hi,

We send multiple types of logs to Logstash, and one of them uploads JSON logs with an "@timestamp" field. Logstash tries to parse that field, and when it fails it copies the value into "_@timestamp" and throws a warning message for each log event.

  • We DO NOT want to parse that field; we are happy with the "_@timestamp" field, but having a warning for every event is not acceptable for us. Is there a way to suppress just those warnings? (We don't want to change the overall logging configuration because we would miss other essential warnings.)

  • I tried to rename/remove the "@timestamp" field (assuming Logstash would create a new "@timestamp" field with the syslog timestamp, since it is a protected field), but it didn't work and I still see Logstash trying to parse "@timestamp" from the logs. I tried the following config:

filter {
      if [log] =~ "^\{.*\}[\s\S]*$" {
        json {
          skip_on_invalid_json => true
          source => "log"
          remove_field => ["log", "@timestamp"]
        }
      }
      mutate {
        rename => {"@timestamp" => "@timestamp_orig" }
      }
}

Example log:

{"@message":"HTTP GET /health","@timestamp":"2021-03-13 07:05:01","@fields":{"meta":{"req":{"url":"/health","headers":{"host":"10.19.206.178:8080","user-agent":"kube-probe/1.17","accept-encoding":"gzip","connection":"close"},"method":"GET","httpVersion":"1.1","originalUrl":"/health","query":{}},"res":{"statusCode":200},"responseTime":0},"level":"info"}}

No, it is unconditional.
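
If you only want to silence that one message, you may be able to lower the level of the specific logger that emits it in config/log4j2.properties, without touching the rest of the logging. A sketch (untested; the logger name here is an assumption and may differ between Logstash versions):

    # config/log4j2.properties
    # Assumption: the @timestamp parse warning is emitted by org.logstash.Event
    logger.event.name = org.logstash.Event
    logger.event.level = error

Other loggers keep their normal level, so you would only lose messages from that one logger.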

@Badger
Thank you for your reply.
Is there a way we can set the "@timestamp" to null so that it will be populated by Logstash automatically?

Or is parsing the timestamp the ONLY option for us? The reason we don't want to parse "@timestamp" is that there is a good chance another application will ship a new "@timestamp" format in the future, and then we would be back to square one.

Thank you

You can use mutate+remove_field to remove the default @timestamp field, then set it to the current time using the configuration from this thread.
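
A minimal sketch of that approach, setting @timestamp directly from a ruby filter (not tested against your pipeline; it assumes a reasonably recent Logstash where LogStash::Timestamp is available):

    filter {
      mutate {
        remove_field => [ "@timestamp" ]
      }
      ruby {
        # Replace the event's @timestamp with the time Logstash processed it
        code => "event.set('@timestamp', ::LogStash::Timestamp.now)"
      }
    }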

@Badger
I tried the following, but still no luck:

      mutate {
        remove_field => [ "@timestamp" ]
      }
      ruby {
        code => "event.set('logstash_processed_at', Time.now());"
      }
      mutate {
        convert => { "logstash_processed_at" => "string" }
      }
      date {
        match => [ "logstash_processed_at", "ISO8601" ]
        target => "@timestamp"
        add_tag => "timestamp_changed"
      }

In the logs I do see

logstash_processed_at: 2021-03-14T19:17:20.364Z

Thank you

I don't know what to say -- that code works for me.

I figured out the issue: since I used the json filter without a target, it automatically parsed "@timestamp" at the top level and threw the warning. I had to get a little bit creative, like this:

      if [log] =~ "^\{.*\}[\s\S]*$" {
        json {
          skip_on_invalid_json => true
          source => "log"
          target => "log_json"
        }
      }
      if [log_json][@timestamp] {
        mutate {
          remove_field => "[log_json][@timestamp]"
        }
      }
      if [log_json] {
        ruby {
            code => '
                event.get("log_json").each { |k, v|
                    event.set(k,v)
                }
                event.remove("log_json")
            '
        }
      }
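
For anyone landing here later, the same idea can probably be condensed into one json pass plus the ruby flattening (a sketch along the lines of the config above, untested):

    filter {
      if [log] =~ "^\{.*\}[\s\S]*$" {
        json {
          skip_on_invalid_json => true
          source => "log"
          target => "log_json"   # parse under a target so the top-level @timestamp is never touched
        }
        ruby {
          code => '
            parsed = event.get("log_json")
            if parsed.is_a?(Hash)
              # copy everything except @timestamp to the top level
              parsed.each { |k, v| event.set(k, v) unless k == "@timestamp" }
              event.remove("log_json")
            end
          '
        }
      }
    }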

Thanks for all your help 🙂

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.