Epoch Time Via Logstash

I have a JSON log file that ingests properly, with the exception of the @timestamp field.

Here is the format:

"@timestamp":1734103443540

I have tried adding the filter below to my Logstash conf file, but it does not work. In Elastic I still see the ingest time, not the actual log message time.

What can I do in my filter to make this work? The rename of the field works, but the rest does not.

filter {
  # Ensure the UNIX timestamp is converted correctly
  mutate {
    rename => { "@timestamp" => "epoch_time" } # Ensure @timestamp is treated as a string (if needed)
  }
  # Ensure epoch_time is treated as a string (sometimes needed for date filter compatibility)
  mutate {
    convert => { "epoch_time" => "string" }
  }
  date {
    match => [ "epoch_time", "UNIX_MS" ] # Replace "@timestamp" with your actual field name
    target => "@timestamp"              # Store the converted time back in @timestamp
    add_field => { "converted_time" => "%{@timestamp}" } # Add a field for debugging
  }
  
  # Parse JSON from the 'message' field if applicable
  json {
    source => "message"
  }
}

Can you share a complete sample message and the rest of your pipeline?

Is your date field from your source message also named @timestamp?

Also, share what the output of Logstash looks like.

Yes, the date field in the source message is also @timestamp.

The Logstash output looks perfect, with the exception of the time issue. I can see every column correctly.

Here is the sample message:

{"@timestamp":1734103443540,"_document_id":"znApqDqO5e4q05rUgVPqKw","action":"org.audit_log_export","actor":"TestUser","actor_id":123456,"actor_ip":"1.1.1.1.1","actor_is_bot":false,"actor_location":{"country_code":"US"},"created_at":1734103443540,"operation_type":"create","org":"ABCCorp","org_id":1222222,"query_phrase":"","request_access_security_header":null,"user_agent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36"}

Most of your filters need to be after the JSON is parsed. But the json filter will not parse that timestamp. It should be logging

Unrecognized @timestamp value, setting current time to @timestamp, original in _@timestamp field {:value=>"1734103443540"}

I think what you want is

# Parse JSON from the 'message' field if applicable
json { source => "message" }

# The json filter could not parse the numeric @timestamp, so it preserved the
# original value in _@timestamp; rename that to a working field
mutate {
    rename => { "_@timestamp" => "epoch_time" }
}
# Ensure epoch_time is a string (sometimes needed for date filter compatibility)
mutate {
    convert => { "epoch_time" => "string" }
}
date {
    match => [ "epoch_time", "UNIX_MS" ]                  # Parse epoch milliseconds
    target => "@timestamp"                                # Store the converted time back in @timestamp
    add_field => { "converted_time" => "%{@timestamp}" }  # Add a field for debugging
}

which produces

                    "@timestamp" => 2024-12-13T15:24:03.540Z,
                    "epoch_time" => "1734103443540",
                    "created_at" => 1734103443540,
                "converted_time" => "2024-12-13T15:24:03.540Z",

Absolutely. That worked. Thanks very much.