Replace the @timestamp of the dashboard with the time at which the event gets logged in the file

Hi there,

I want to replace the @timestamp value with the time from the log entry itself. I am using Logstash as a log collector. This is not working for me; the parsed logs still get my local timestamp.

Here is the configuration I am trying:

# Logstash configuration for a simple
# File -> Logstash -> Elasticsearch pipeline.

input {
  file {
    path => "/Users/fsyed/workspaces/vault_logs/tmp/*.log"
    start_position => "beginning"
    max_open_files => 64000
    codec => "json"
    sincedb_path => "/dev/null"
  }
}

filter {
  json {
    source => "message"
  }
  date {
    match => [ "time", "MMM dd, yyyy @ HH:mm:ss.SSS" ]
    target => "@timestamp"
  }
  mutate {
    remove_field => ["[log][file][path]"]
  }

}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => [ "https://0.0.0.0:9200" ]
    ssl_certificate_verification => false
    user => "elastic"
    password => "##########"
    index => "vault-%{+YYYY.MM.dd}"
  }
}

Here is the output from my audit log file:

 "file": "/var/log/vault/audit.log",
  "host": "ip-10-66-9-115",
  "message": "{\"time\":\"2024-04-10T07:54:10.196277961Z\",\"type\":\"response\",\"auth\":{\"client_token\":\"hmac-sha256:fd91e5d3670b9302c0a51a8cfeaa4121993c09e127cd55a2f9d4a15ba304cea5\",\"accessor\":\"hmac-sha256:ab037477767efa3229a2e55ceb6e89e8111f645fb9c0a6e8e9075518a9bd3bc5\",\"display_name\":\"k8s-ripplenet-prod-liquidity-voltron-eng-wf-wallet-funding-balance\",\"policies\":[\"default\",\"locus-3b416-balance-app-policy\"],\"token_policies\":[\"default\",\"locus-3b416-balance-app-policy\"],\"policy_results\":{\"allowed\":true,\"granting_policies\":[{\"name\":\"locus-3b416-balance-app-policy\",\"namespace_id\":\"root\",\"type\":\"acl\"}]},\"metadata\":{\"role\":\"locus-3b416-balance-app-role\",\"service_account_name\":\"wallet-funding-balance\",\"service_account_namespace\":\"liquidity-ewewde-eng-wf\",\"service_account_secret_name\":\"\",\"service_account_uid\":\"93e727b5-9c52-4f01-959c-dfc5af57f0b1\"},\"entity_id\":\"016e1cb6-dc03-e745-c3c6-cb788aeb644c\",\"token_type\":\"service\",\"token_ttl\":86400,\"token_issue_time\":\"2024-04-02T23:02:54Z\"},\"request\":{\"id\":\"486df7c9-322e-b750-d542-8e79bca277e7\",\"client_id\":\"016e1cb6-dc03-e745-c3c6-cb788aeb644c\",\"operation\":\"update\",\"mount_point\":\"data_encryption/liquiltron/eng/strato/\",\"mount_type\":\"transit\",\"mount_accessor\":\"transit_27361b44\",\"mount_running_version\":\"v1.14.1+builtin.vault\",\"mount_class\":\"secret\",\"client_token\":\"hmac-*****************\",\"client_token_accessor\":\"hmac-sha***************************************\",\"namespace\":{\"id\":\"root\"},\"path\":\"data_encryption/key\",\"data\":{\"ciphertext\":\"hmac-sha*********************************************\",\"context\":\"hmac-sha*************************************************\"},\"remote_address\":\"10.66.11.164\",\"remote_port\":37642},\"response\":{\"mount_point\":\"data_encryption/eng/strato/\",\"mount_type\":\"transit\",\"mount_accessor\":\"transit_27361b44\",\"mount_running_plugin_version\":\"v1.14.1+builtin.vault\",\"mount_class\":\"secret\",\"data\":{\"plaintext\":\"hmac-sha256:a********************************************\"}}}",
  "source_type": "file",
  "timestamp": "2024-04-10T07:54:10.404110412Z"
}

I tried updating it to this configuration, but it created two indices with different timestamps.

filter {
  json {
    source => "message"
  }
  date {
    match => [ "timestamp", "yyyy-MM-dd'T'HH:mm:ss.SSSSSSSSZZ" ]
    target => "@timestamp"
  }
  mutate {
    remove_field => ["[log][file][path]"]
  }
}

I think that should be match => [ "time", "ISO8601" ]. Events the date filter cannot parse keep their processing-time @timestamp (and get a _dateparsefailure tag), which is why vault-%{+YYYY.MM.dd} ended up writing to two different daily indices.
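
A minimal sketch of the full filter block with that change, keeping the json and mutate filters as you posted them:

filter {
  json {
    source => "message"
  }
  date {
    # ISO8601 matches timestamps like "2024-04-10T07:54:10.196277961Z",
    # including the variable-length fractional seconds
    match => [ "time", "ISO8601" ]
    target => "@timestamp"
  }
  mutate {
    remove_field => ["[log][file][path]"]
  }
}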


Thanks for your reply. This resolved my issue, but I am seeing intermittent errors in the Logstash logs. Any idea what could be wrong with my configuration?

:response=>{"index"=>{"status"=>400, "error"=>{"type"=>"document_parsing_exception", "reason"=>"[1:12125] object mapping for [response.data.keys] tried to parse field [null] as object, but found a concrete value"}}}}
,\"hmac-sha256:4449f8c39f55368d2f2282159f7f25a73ff6449446fca2b48f86eafdf72fa354\",\"hmac-sha256:1b39da8d22d555841eec6da1096967915c0467febf9f0d636529fea74d279deb\",\"hmac-sha256:2cd751d486c9fc828588e25ff0222a9a4d2b78029dd5d6aded2494695e0de98e\",\"hmac-sha256:aecdcef6691493a161b26be1ff01a36a1da4a9d29cb8c69f752766e585e70abd\",\"hmac-sha256:13c3e6d5b9fcf8d305ad91f70ef71ce9746ed9fe90c6c9e5a59d9aee888f4b38\",\"hmac-sha256:5f7a6d882e436f0b6269d40724ec9260fd4bac6f1bed970ac8b159ee4caa1a78\",\"hmac-sha256:0b4b64ae1fc658e6827bb26e24dfa7bfb09f379590a3b9bff832b4e673370917\",\"hmac-sha256:9437f0d005ebdfb354b23c0d625eebe9af983daffb97378ddd642e6cc864a24e\",\"hmac-sha256:b8ab64596a330a55cc7c12a52186c6d5936ea1da1a88008735f930fcc71f39af\",\"hmac-sha256:07448b6c0451a1713d56cfa50f09b3ab5d285b762f6cf0008e887e98de37e2ff\",\"hmac-sha256:4f6b63668cf1d89e7e9d22ebb043dff664bd40d216f74354fad03adef19b0bae\",\"hmac-sha256:f25f52687bb9ccf2a9efb041c0a7d44f3bc9c452843b0a2cd3fcd6d7627e59b4\",\"hmac-sha256:6906fdd02521e33367d9eacd6669a705e5a9ab911f1542e53b2031d61eb7fe0d\",\"hmac-sha256:c96bf5c8897f112a7a7d098630ff5b75108257189cc435863306f884294f4402\",\"hmac-sha256:f9944681deb32b8095d256a7ce0dcbaf137eec5a2ba3e8d11800cf37bae14254\",\"hmac-sha256:e58fcfc2e455a4b8979eb8392d297334a8467e28596cd48594fd557ef0c835a1\",\"hmac-sha256:79415e6e5c46488af9cea213cdfeb250d80103afa6d326fe4af33c979b6fb6e8\",\"hmac-sha256:4db0b607fd4d4eb43077d28c06af3e505e017a876dfc8ca9deeeff0b3633aab9\",\"hmac-sha256:184cf98bf65a99c53bdbbc2002e779246c68ff9d368d6aed034cf930666838f1\",\"hmac-sha256:52e5a11e3055149306c3caeeb1381fedb22e271e1b07f4605b7b622f9cb91613\",\"hmac-sha256:3fca13e82540be62a980089719353ec1ebbe573d4825de09bc83a86c10a2e7e3\",\"hmac-sha256:858b9a83029509b6898a2432b89a02257d0fe1dab25a77a273730287dc885928\",\"hmac-sha256:09b035b7104e1a8e5f55a59541d17b9eb856b9c1406973b173fee3a12bd64771\",\"hmac-sha256:98b9215b16b115f728dad0bc165c290fbfb9a7dc41b32ad50d6ccb523d9d8c19\",\"hmac-sha256:982f4f675689c4bab780cf89c9d026dffe2a65c2520b9ed38184f7112c7cbad4\",\"hmac-sha256:419b09b573b3362375728119963e600dddf4d93d61ae6ff1cacd920eeb5508d8\",\"hmac-sha256:e6623d9b3fe2d3ce831c25a7c952ffc9e1ceb36a2e49cc87d0f21a78873df728\",\"hmac-sha256:ecc887f2fadbcefa94ee7a45b94d7b7b01ea184038322b22c7018d058315cac1\",\"hmac-sha256:c1c8985a817e55b0ac3843aba5f02f284a146b31ae0bbc43eb52f0d2f5bda1ca\",\"hmac-sha256:8d99590aa605cb641e206c03ffc0a92b3fc97828481073fe90407107bec18950\",\"hmac-sha256:9602678cddc2125380e19be06c9457d107e294332e7eb734f073e811bc986af7\",\"hmac-sha256:4536676570dfaf169bb60de4d61e966464479796d75e427f907967df03debfe9\",\"hmac-sha256:9b609f1fac533383d3ca5d8d359989835a705f85cdcd099993d6b15fee4a1f00\",\"hmac-sha256:6872147ea1a5d08950e1cdbdd5187286071169c97d85a65e20314509255a50c1\",\"hmac-sha256:29280240fe6bce3f16b0774944e34413904627b6b70b80792fca4f70a1c9fd1d\",\"hmac-sha256:7a759a7e1cb7cc261d80a08e3d5097e91badbb13dd89768638a8597c84a61761\",\"hmac-sha256:9d8cc48d6157b0bf0cfc2093154de3244b227d40f18ee67aaa7fc8c907ad617c\",\"hmac-sha256:2498d4d4c163c1fad7487f373b04a115e75415397f080a4a11643a1616ceea7d\",\"hmac-sha256:d66a8bc825f94ae5c760b02d4887587fb788a0731a81edbefcce2b026adf5c32\",\"hmac-sha256:4e4bf168160b2aa889036983f448e590748184bb7ba5d593ebacf3c6d614fd10\",\"hmac-sha256:9363da7af27336fa3da26314d39735c48e623c5be8f322efefb8ef28191c159e\"]}}}"}], :response=>{"index"=>{"status"=>400, "error"=>{"type"=>"document_parsing_exception", "reason"=>"[1:12125] object mapping for [response.data.keys] tried to parse field [null] as object, but found a concrete value"}}}}

See this thread. It's really an elasticsearch indexing error. To fix it you need to change the structure of your data.

Typically you will see a message like "object mapping for [a.b] tried to parse field [b] as object, but found a concrete value". That is telling you that ES expects an object that looks like { "a": { "b": "foo" } } but it got something like { "a": "bar" }. The value of [a] is an object in the former, but a string in the latter. That is not supported.

I have seen a variant with [null] in the error message for a data structure like

"data": [
    { "a": { "b": "X" } },
    { "a": { "b": "Y" } },
    { "a": [ "Z" ] }
]

I would suggest configuring a DLQ (dead letter queue), then writing the failing messages out to a file using a rubydebug codec, then sitting down with a failed message and the mapping of the index to figure out where the mismatch is. Once you know what needs to change in the failing events, add code to your logstash pipeline to make those changes.
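
A rough sketch of both steps (paths and the pipeline id are assumptions, adjust them to your install). First enable the DLQ in logstash.yml; the elasticsearch output will then divert events rejected with a 400 (such as your document_parsing_exception) into the queue instead of dropping them:

dead_letter_queue.enable: true

A second pipeline can then read the queue and dump the failed events to a file for inspection:

input {
  dead_letter_queue {
    # default location is <path.data>/dead_letter_queue; adjust as needed
    path => "/var/lib/logstash/dead_letter_queue"
    pipeline_id => "main"
    commit_offsets => true
  }
}
output {
  file {
    path => "/tmp/dlq-events.log"
    codec => rubydebug { metadata => true }
  }
}

Once you know which field clashes, the fix is usually a small filter. As a hypothetical example for the [response][data][keys] field named in your error (the keys_list name is made up), you could move the non-object shape to its own field so the two shapes never share a mapping:

filter {
  ruby {
    code => '
      keys = event.get("[response][data][keys]")
      # If keys is present but not an object, move it to a separate
      # field so it cannot collide with documents where it is an object
      if !keys.nil? && !keys.is_a?(Hash)
        event.set("[response][data][keys_list]", keys)
        event.remove("[response][data][keys]")
      end
    '
  }
}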

And yes, sometimes getting a dataset into elasticsearch is a lot of work.