Discrepancy in Log Filtering in Logstash

Hi All,

We are currently using the filter below in Logstash to process logs.
Filter -

filter {
  # Parse the JSON string from the message field into "parsed_msg"
  json {
    source => "message"
    target => "parsed_msg"
    skip_on_invalid_json => true
  }

  # Process the payload: flatten if it's a hash, or store as a field if it's a string
  if [parsed_msg][payload] {
    ruby {
      code => '
        payload = event.get("parsed_msg")["payload"];
        if payload.is_a?(Hash)
          payload.each { |k, v|
            event.set("fr." + k, v)
          }
        else
          event.set("fr.payload", payload)
        end
      '
    }
  }

  # Extract parsed_msg.source to a top-level field "result_source"
  if [parsed_msg][source] {
    mutate {
      add_field => { "result_source" => "%{[parsed_msg][source]}" }
    }
  }

  # Extract parsed_msg.timestamp to a top-level field "result_timestamp"
  if [parsed_msg][timestamp] {
    mutate {
      add_field => { "result_timestamp" => "%{[parsed_msg][timestamp]}" }
    }
  }

  # Overwrite @timestamp with result_timestamp
  date {
    match => ["result_timestamp", "ISO8601"]
    target => "@timestamp"
  }

  # Generate a unique fingerprint based on the log type
  if [result_source] == "idm-core" {
    fingerprint {
      source => ["[parsed_msg][payload]", "result_source", "result_timestamp"]
      target => "[@metadata][fingerprint]"
      method => "SHA256"
      concatenate_sources => true
    }
  } else {
    fingerprint {
      source => ["fr._id", "fr.timestamp","result_source", "fr.message", "fr.transactionId", "fr.eventName", "result_timestamp","fr.after._id","fr.before._id"]
      target => "[@metadata][fingerprint]"
      method => "SHA256"
      concatenate_sources => true
    }
  }

  # Prune the event to keep only desired fields
  prune {
    whitelist_names => ["^fr.*$", "^@timestamp$", "^@metadata$", "^result_source$", "^result_timestamp$", "^tags$", "^fingerprint$","^fields.*$"]
  }

  # Remove the temporary parsed_msg field
  mutate {
    remove_field => ["parsed_msg"]
  }
}

We have two similar log formats, yet only one is being filtered correctly while the other is not. Could anyone advise on potential reasons for this discrepancy and suggest a resolution?
Log 1 - getting filtered -

{"@timestamp":"2025-02-18T15:51:49.117Z","@metadata":{"beat":"filebeat","type":"_doc","version":"8.15.2","input_id":"generic-httpjson-staging-idm","stream_id":"httpjson-httpjson.staging_idm","raw_index":"logs-httpjson.generic-default"},"event":{"created":"2025-02-18T15:51:49.117Z","dataset":"httpjson.generic"},"tags":["staging_idm"],"input":{"type":"httpjson"},"agent":{"id":"52887bbd-0645-4c47-8e63-4095f2705fa8","ephemeral_id":"8e494863-e58f-4844-972a-3ff1c69bbedb","name":"d1entsttlsr007.europe.aa.local","type":"filebeat","version":"8.15.2"},"ecs":{"version":"8.0.0"},"message":"{\"payload\":{\"_id\":\"185292be-0ecc-4301-88fd-98eac8de94f3-3812871\",\"after\":{\"_id\":\"4b809f69-a399-4a1a-9043-7bedd53de067\",\"_rev\":\"418f3aa6-1048-4a80-9484-2df27fcc36bf-858630\",\"createDate\":\"2025-02-04T14:18:44.354934954Z\",\"lastChanged\":{\"date\":\"2025-02-18T15:50:58.282619577Z\"},\"loginCount\":0},\"before\":{\"_id\":\"4b809f69-a399-4a1a-9043-7bedd53de067\",\"_rev\":\"418f3aa6-1048-4a80-9484-2df27fcc36bf-852420\",\"createDate\":\"2025-02-04T14:18:44.354934954Z\",\"lastChanged\":{\"date\":\"2025-02-18T13:06:37.789551815Z\"},\"loginCount\":0},\"changedFields\":[],\"eventName\":\"activity\",\"level\":\"INFO\",\"message\":\"\",\"objectId\":\"managed/alpha_usermeta/4b809f69-a399-4a1a-9043-7bedd53de067\",\"operation\":\"PATCH\",\"passwordChanged\":false,\"revision\":\"418f3aa6-1048-4a80-9484-2df27fcc36bf-858630\",\"runAs\":\"idm-provisioning\",\"source\":\"audit\",\"status\":\"SUCCESS\",\"timestamp\":\"2025-02-18T15:50:58.293Z\",\"topic\":\"activity\",\"transactionId\":\"e6b42174-165f-43db-b464-135e0f369f6f/0/5\",\"userId\":\"idm-provisioning\"},\"source\":\"idm-activity\",\"timestamp\":\"2025-02-18T15:50:58.294116388Z\",\"type\":\"application/json\"}","fields":{"environment":"staging"},"data_stream":{"dataset":"httpjson.generic","namespace":"default","type":"logs"},"elastic_agent":{"snapshot":false,"version":"8.15.2","id":"52887bbd-0645-4c47-8e63-4095f2705fa8"},"host":{"name":"d1entsttlsr007.europe.aa.local","hostname":"d1entsttlsr007.europe.aa.local","architecture":"x86_64","os":{"platform":"rhel","version":"7.7 (Maipo)","family":"redhat","name":"Red Hat Enterprise Linux Server","kernel":"3.10.0-1160.118.1.el7.x86_64","codename":"Maipo","type":"linux"},"id":"69f417b1a15a420eb9acfd36802ff697","containerized":false,"ip":["10.178.2.12","fe80::477:70ff:fefe:3fdb"],"mac":["06-77-70-FE-3F-DB"]},"cloud":{"service":{"name":"Nova"},"provider":"openstack","availability_zone":"eu-west-1a","instance":{"name":"ip-10-178-2-12.europe.aa.local","id":"i-0528118076d76fe10"},"machine":{"type":"c5.2xlarge"}}
Log 2 - not getting filtered -
{"@timestamp":"2025-02-18T15:51:49.117Z","@metadata":{"beat":"filebeat","type":"_doc","version":"8.15.2","raw_index":"logs-httpjson.generic-default","input_id":"generic-httpjson-staging-idm","stream_id":"httpjson-httpjson.staging_idm"},"cloud":{"machine":{"type":"c5.2xlarge"},"service":{"name":"Nova"},"provider":"openstack","availability_zone":"eu-west-1a","instance":{"id":"i-0528118076d76fe10","name":"ip-10-178-2-12.europe.aa.local"}},"tags":["staging_idm"],"input":{"type":"httpjson"},"fields":{"environment":"staging"},"agent":{"id":"52887bbd-0645-4c47-8e63-4095f2705fa8","version":"8.15.2","ephemeral_id":"8e494863-e58f-4844-972a-3ff1c69bbedb","name":"d1entsttlsr007.europe.aa.local","type":"filebeat"},"ecs":{"version":"8.0.0"},"message":"{\"payload\":{\"_id\":\"185292be-0ecc-4301-88fd-98eac8de94f3-3812872\",\"after\":{\"_id\":\"b28e8d51-b392-44c8-bfa9-b031c3b8eb0c\",\"_rev\":\"418f3aa6-1048-4a80-9484-2df27fcc36bf-858629\",\"accountStatus\":\"active\",\"aliasList\":[],\"assignedDashboard\":[],\"city\":\"Bath\",\"cn\":\"Will West\",\"consentedMappings\":[],\"country\":\"United Kingdom\",\"custom_IDD\":\"undefined\",\"custom_gender\":\"F\",\"custom_languageCodeID\":\"en-gb\",\"custom_mobilephone\":\"76581916480\",\"custom_title\":\"MR\",\"description\":null,\"displayName\":null,\"effectiveApplications\":[],\"effectiveAssignments\":[],\"effectiveGroups\":[],\"effectiveRoles\":[],\"frIndexedDate1\":null,\"frIndexedDate2\":null,\"frIndexedDate3\":null,\"frIndexedDate4\":null,\"frIndexedDate5\":null,\"frIndexedInteger1\":null,\"frIndexedInteger2\":null,\"frIndexedInteger3\":null,\"frIndexedInteger4\":null,\"frIndexedInteger5\":null,\"frIndexedMultivalued1\":[],\"frIndexedMultivalued2\":[],\"frIndexedMultivalued3\":[],\"frIndexedMultivalued4\":[],\"frIndexedMultivalued5\":[],\"frIndexedString1\":\"129447756\",\"frIndexedString2\":\"1/1/2000 12:00:00 AM\",\"frIndexedString3\":null,\"frIndexedString4\":null,\"frIndexedString5\":null,\"frUnindexedDate1\":null,\"frUnindexedDate2\":null,\"frUnindexedDate3\":null,\"frUnindexedDate4\":null,\"frUnindexedDate5\":null,\"frUnindexedInteger1\":null,\"frUnindexedInteger2\":null,\"frUnindexedInteger3\":null,\"frUnindexedInteger4\":null,\"frUnindexedInteger5\":null,\"frUnindexedMultivalued1\":[],\"frUnindexedMultivalued2\":[],\"frUnindexedMultivalued3\":[],\"frUnindexedMultivalued4\":[],\"frUnindexedMultivalued5\":[],\"frUnindexedString1\":null,\"frUnindexedString2\":null,\"frUnindexedString3\":null,\"frUnindexedString4\":null,\"frUnindexedString5\":null,\"givenName\":\"Will\",\"kbaInfo\":[],\"mail\":\"CapTest03022025LI3Gen_15402769@test.com\",\"memberOfOrgIDs\":[],\"postalAddress\":\"23 Lorne Road\",\"postalCode\":\"BA2 3BY\",\"preferences\":{\"ejpartnerupdateandoffer\":true,\"ejupdateandoffer\":true},\"profileImage\":null,\"sn\":\"West\",\"stateProvince\":\"UNDEFINED\",\"telephoneNumber\":null,\"userName\":\"CapTest03022025LI3Gen_15402769@test.com\"},\"before\":{\"_id\":\"b28e8d51-b392-44c8-bfa9-b031c3b8eb0c\",\"_rev\":\"418f3aa6-1048-4a80-9484-2df27fcc36bf-858533\",\"accountStatus\":\"active\",\"aliasList\":[],\"assignedDashboard\":[],\"city\":\"Bath\",\"cn\":\"Will West\",\"consentedMappings\":[],\"country\":\"United 
Kingdom\",\"custom_IDD\":\"undefined\",\"custom_gender\":\"F\",\"custom_languageCodeID\":\"en-gb\",\"custom_mobilephone\":\"25231149988\",\"custom_title\":\"MR\",\"description\":null,\"displayName\":null,\"effectiveApplications\":[],\"effectiveAssignments\":[],\"effectiveGroups\":[],\"effectiveRoles\":[],\"frIndexedDate1\":null,\"frIndexedDate2\":null,\"frIndexedDate3\":null,\"frIndexedDate4\":null,\"frIndexedDate5\":null,\"frIndexedInteger1\":null,\"frIndexedInteger2\":null,\"frIndexedInteger3\":null,\"frIndexedInteger4\":null,\"frIndexedInteger5\":null,\"frIndexedMultivalued1\":[],\"frIndexedMultivalued2\":[],\"frIndexedMultivalued3\":[],\"frIndexedMultivalued4\":[],\"frIndexedMultivalued5\":[],\"frIndexedString1\":\"129447756\",\"frIndexedString2\":\"1/1/2000 12:00:00 AM\",\"frIndexedString3\":null,\"frIndexedString4\":null,\"frIndexedString5\":null,\"frUnindexedDate1\":null,\"frUnindexedDate2\":null,\"frUnindexedDate3\":null,\"frUnindexedDate4\":null,\"frUnindexedDate5\":null,\"frUnindexedInteger1\":null,\"frUnindexedInteger2\":null,\"frUnindexedInteger3\":null,\"frUnindexedInteger4\":null,\"frUnindexedInteger5\":null,\"frUnindexedMultivalued1\":[],\"frUnindexedMultivalued2\":[],\"frUnindexedMultivalued3\":[],\"frUnindexedMultivalued4\":[],\"frUnindexedMultivalued5\":[],\"frUnindexedString1\":null,\"frUnindexedString2\":null,\"frUnindexedString3\":null,\"frUnindexedString4\":null,\"frUnindexedString5\":null,\"givenName\":\"Will\",\"kbaInfo\":[],\"mail\":\"CapTest03022025LI3Gen_15402769@test.com\",\"memberOfOrgIDs\":[],\"postalAddress\":\"23 Lorne Road\",\"postalCode\":\"BA2 3BY\",\"preferences\":{\"ejpartnerupdateandoffer\":true,\"ejupdateandoffer\":true},\"profileImage\":null,\"sn\":\"West\",\"stateProvince\":\"UNDEFINED\",\"telephoneNumber\":null,\"userName\":\"CapTest03022025LI3Gen_15402769@test.com\"},\"changedFields\":[],\"eventName\":\"activity\",\"level\":\"INFO\",\"message\":\"\",\"objectId\":\"managed/alpha_user/b28e8d51-b392-44c8-bfa9-b031c3b8eb0c\",\"operation\":\"PATCH\",\"passwordChanged\":false,\"revision\":\"418f3aa6-1048-4a80-9484-2df27fcc36bf-858629\",\"runAs\":\"idm-provisioning\",\"source\":\"audit\",\"status\":\"SUCCESS\",\"timestamp\":\"2025-02-18T15:50:58.294Z\",\"topic\":\"activity\",\"transactionId\":\"e6b42174-165f-43db-b464-135e0f369f6f/0/5\",\"userId\":\"idm-provisioning\"},\"source\":\"idm-activity\",\"timestamp\":\"2025-02-18T15:50:58.29472009Z\",\"type\":\"application/json\"}","event":{"created":"2025-02-18T15:51:49.117Z","dataset":"httpjson.generic"},"data_stream":{"namespace":"default","type":"logs","dataset":"httpjson.generic"},"elastic_agent":{"version":"8.15.2","id":"52887bbd-0645-4c47-8e63-4095f2705fa8","snapshot":false},"host":{"hostname":"d1entsttlsr007.europe.aa.local","architecture":"x86_64","os":{"kernel":"3.10.0-1160.118.1.el7.x86_64","codename":"Maipo","type":"linux","platform":"rhel","version":"7.7 (Maipo)","family":"redhat","name":"Red Hat Enterprise Linux Server"},"id":"69f417b1a15a420eb9acfd36802ff697","containerized":false,"ip":["10.178.2.12","fe80::477:70ff:fefe:3fdb"],"mac":["06-77-70-FE-3F-DB"],"name":"d1entsttlsr007.europe.aa.local"}}

Regards
Ereek

Hello,

Can you provide more context? You have multiple filters; which one is not working?

Also, can you share the source message before it is processed by Logstash?

Hi @leandrojmp - The second log is not visible in Kibana, while we can see the first one. The issue seems to be that the same fingerprint is being generated for both of these logs.

In my post above, I have already shared the Kafka logs, where the 1st one is getting through to Kibana and the 2nd is not.

Also, we are sending the logs from Elastic Agent -> Kafka -> Logstash -> Elasticsearch -> Kibana.

Are you using the fingerprint as a custom _id? You didn't share the full configuration, so it is not possible to know.

Also, the fingerprint is a @metadata field, which is also not present in the samples you shared, so it is not clear what the fingerprint is.

Can you share the full configuration or at least the output?
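
If it helps, a quick way to see the fingerprint is to temporarily add a debug output that also prints @metadata. A minimal sketch, independent of your real outputs:

output {
  stdout {
    # rubydebug hides @metadata by default; metadata => true makes [@metadata][fingerprint] visible
    codec => rubydebug { metadata => true }
  }
}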

Sorry for the confusion, @leandrojmp.

Please find the full Logstash configuration below.

input {
  kafka {
    bootstrap_servers => "----"
    topics             => ["aa_cm_test_app_topic"]
    codec              => "json"
    consumer_threads   => 3
    group_id           => "elastic-agent-consumer-group"
    security_protocol  => "SSL"
    ssl_truststore_location => "----/kafka.client.truststore.jks"
    ssl_truststore_password => "----"
  }
}

filter {
  # Parse the JSON string from the message field into "parsed_msg"
  json {
    source => "message"
    target => "parsed_msg"
    skip_on_invalid_json => true
  }

  # Process the payload: flatten if it's a hash, or store as a field if it's a string
  if [parsed_msg][payload] {
    ruby {
      code => '
        payload = event.get("parsed_msg")["payload"];
        if payload.is_a?(Hash)
          payload.each { |k, v|
            event.set("fr." + k, v)
          }
        else
          event.set("fr.payload", payload)
        end
      '
    }
  }

  # Extract parsed_msg.source to a top-level field "result_source"
  if [parsed_msg][source] {
    mutate {
      add_field => { "result_source" => "%{[parsed_msg][source]}" }
    }
  }

  # Extract parsed_msg.timestamp to a top-level field "result_timestamp"
  if [parsed_msg][timestamp] {
    mutate {
      add_field => { "result_timestamp" => "%{[parsed_msg][timestamp]}" }
    }
  }

  # Overwrite @timestamp with result_timestamp
  date {
    match => ["result_timestamp", "ISO8601"]
    target => "@timestamp"
  }

  # Generate a unique fingerprint based on the log type
  if [result_source] == "idm-core" {
    fingerprint {
      source => ["[parsed_msg][payload]", "result_source", "result_timestamp"]
      target => "[@metadata][fingerprint]"
      method => "SHA256"
      concatenate_sources => true
    }
  } else {
    fingerprint {
      source => ["fr._id", "fr.timestamp","result_source", "fr.message", "fr.transactionId", "fr.eventName", "result_timestamp","fr.after._id","fr.before._id"]
      target => "[@metadata][fingerprint]"
      method => "SHA256"
      concatenate_sources => true
    }
  }

  # Prune the event to keep only desired fields
  prune {
    whitelist_names => ["^fr.*$", "^@timestamp$", "^@metadata$", "^result_source$", "^result_timestamp$", "^tags$", "^fingerprint$","^fields.*$"]
  }

  # Remove the temporary parsed_msg field
  mutate {
    remove_field => ["parsed_msg"]
  }
}


output {
  if [result_source] == "am-access" {
    elasticsearch {
      index => "cm-test-am-access-app-%{+YYYY.MM.dd}"
      hosts => ["---"]
      api_key => ["----"]
      ssl => true
      ilm_rollover_alias => "cm-test-am-access-app"
      ilm_pattern => "000001"
      ilm_policy => "---"
      document_id => "%{[@metadata][fingerprint]}"
    }
  } else if [result_source] == "am-authentication" {
    elasticsearch {
      index => "cm-test-am-authentication-app-%{+YYYY.MM.dd}"
      hosts => ["---"]
      api_key => ["---"]
      ssl => true
      ilm_rollover_alias => "cm-test-am-authentication-app"
      ilm_pattern => "000001"
      ilm_policy => "---"
      document_id => "%{[@metadata][fingerprint]}"
    }
  }
  else if [result_source] == "am-config" {
    elasticsearch {
      index => "cm-test-am-config-app-%{+YYYY.MM.dd}"
      hosts => ["---"]
      api_key => ["---"]
      ssl => true
      ilm_rollover_alias => "cm-test-am-config-app"
      ilm_pattern => "000001"
      ilm_policy => "---"
      document_id => "%{[@metadata][fingerprint]}"
    }
  }
  else if [result_source] == "am-activity" {
    elasticsearch {
      index => "cm-test-am-activity-app-%{+YYYY.MM.dd}"
      hosts => ["---"]
      api_key => ["---"]
      ssl => true
      ilm_rollover_alias => "cm-test-am-activity-app"
      ilm_pattern => "000001"
      ilm_policy => "---"
      document_id => "%{[@metadata][fingerprint]}"
    }
  }
  else if [result_source] == "am-core" {
    elasticsearch {
      index => "cm-test-am-core-app-%{+YYYY.MM.dd}"
      hosts => ["---"]
      api_key => ["---"]
      ssl => true
      ilm_rollover_alias => "cm-test-am-core-app"
      ilm_pattern => "000001"
      ilm_policy => "---"
      document_id => "%{[@metadata][fingerprint]}"
    }
  } else if [result_source] == "idm-access" {
    elasticsearch {
      index => "cm-test-idm-access-app-%{+YYYY.MM.dd}"
      hosts => ["---"]
      api_key => ["---"]
      ssl => true
      ilm_rollover_alias => "cm-test-idm-access-app"
      ilm_pattern => "000001"
      ilm_policy => "---"
      document_id => "%{[@metadata][fingerprint]}"
    }
  }
  else if [result_source] == "idm-activity" {
    elasticsearch {
      index => "cm-test-idm-activity-app-%{+YYYY.MM.dd}"
      hosts => ["---"]
      api_key => ["---"]
      ssl => true
      ilm_rollover_alias => "cm-test-idm-activity-app"
      ilm_pattern => "000001"
      ilm_policy => "---"
      document_id => "%{[@metadata][fingerprint]}"
    }
  }
  else if [result_source] == "idm-authentication" {
    elasticsearch {
      index => "cm-test-idm-authentication-app-%{+YYYY.MM.dd}"
      hosts => ["---"]
      api_key => ["---"]
      ssl => true
      ilm_rollover_alias => "cm-test-idm-authentication-app"
      ilm_pattern => "000001"
      ilm_policy => "---"
      document_id => "%{[@metadata][fingerprint]}"
    }
  }
  else if [result_source] == "idm-config" {
    elasticsearch {
      index => "cm-test-idm-config-app-%{+YYYY.MM.dd}"
      hosts => ["---"]
      api_key => ["---"]
      ssl => true
      ilm_rollover_alias => "cm-test-idm-config-app"
      ilm_pattern => "000001"
      ilm_policy => "---"
      document_id => "%{[@metadata][fingerprint]}"
    }
  }
  else if [result_source] == "idm-core" {
    elasticsearch {
      index => "cm-test-idm-core-app-%{+YYYY.MM.dd}"
      hosts => ["---"]
      api_key => ["---"]
      ssl => true
      ilm_rollover_alias => "cm-test-idm-core-app"
      ilm_pattern => "000001"
      ilm_policy => "---"
      document_id => "%{[@metadata][fingerprint]}"
    }
  } else {
    elasticsearch {
      index => "cm-test-default-app-%{+YYYY.MM.dd}"
      hosts => ["---"]
      api_key => ["---"]
      ssl => true
      ilm_rollover_alias => "cm-test-default-app"
      ilm_pattern => "000001"
      ilm_policy => "---"
      document_id => "%{[@metadata][fingerprint]}"
    }
  }
}

Where did you get that the fingerprint is the same? You are talking about the [@metadata][fingerprint] field, right?

They are not the same.

Do you have any errors in Logstash regarding mapping issues? I'm trying to understand what the issue could be here, but to be honest, using literal dots in the field names makes the document a little confusing.
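
For example, if the intent is nested fields rather than names with literal dots, the ruby block could use bracketed field references. A rough, untested sketch of that idea:

ruby {
  code => '
    payload = event.get("[parsed_msg][payload]")
    if payload.is_a?(Hash)
      # a bracketed reference creates a nested [fr][key] field instead of a literal "fr.key" name
      payload.each { |k, v| event.set("[fr][" + k.to_s + "]", v) }
    else
      event.set("[fr][payload]", payload)
    end
  '
}

The fingerprint sources and the prune whitelist would then need to be adjusted to match whichever naming you settle on.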

Yes @leandrojmp. The missing logs are moving to the DLQ error folder.

It is showing the below error:

Could not index event to Elasticsearch. status: 400, action: ["index", {:_id=>"cc3ad818fa506c45cdc80fc640ff435b027d7b98d408cbe7f4540cf89398f57c", :_index=>"cm-test-default-app", :routing=>nil}, {"fr.level"=>"INFO", "fr.userId"=>"idm-provisioning", "tags"=>["staging_idm"], "fr.changedFields"=>[], "fr.status"=>"SUCCESS", "fr.transactionId"=>"1d2c355b-0db2-4d86-8d35-ca1faefda277/0/5", "fr.runAs"=>"idm-provisioning", "fr.eventName"=>"activity", "fr.passwordChanged"=>false, "fr.before"=>{"lastChanged"=>{"date"=>"2025-02-14T14:58:10.698673841Z"}, "_rev"=>"418f3aa6-1048-4a80-9484-2df27fcc36bf-130282", "_id"=>"db283b4b-fd08-4326-afa4-dae32d23e9b1", "createDate"=>"2025-02-04T14:15:44.523314187Z", "loginCount"=>0}, "fr.timestamp"=>"2025-02-14T14:58:11.480Z", "fields"=>{"environment"=>"staging"}, "fr.operation"=>"PATCH", "result_source"=>"idm-activity", "fr.objectId"=>"managed/alpha_usermeta/db283b4b-fd08-4326-afa4-dae32d23e9b1", "fr.revision"=>"418f3aa6-1048-4a80-9484-2df27fcc36bf-130371", "fr.source"=>"audit", "fr.message"=>"", "fr.after"=>{"lastChanged"=>{"date"=>"2025-02-14T14:58:11.470548473Z"}, "_rev"=>"418f3aa6-1048-4a80-9484-2df27fcc36bf-130371", "_id"=>"db283b4b-fd08-4326-afa4-dae32d23e9b1", "createDate"=>"2025-02-04T14:15:44.523314187Z", "loginCount"=>0}, "@timestamp"=>2025-02-14T14:58:11.480Z, "result_timestamp"=>"2025-02-14T14:58:11.480695696Z", "fr.topic"=>"activity", "fr._id"=>"185292be-0ecc-4301-88fd-98eac8de94f3-1579910"}], response: {"index"=>{"status"=>400, "error"=>{"type"=>"document_parsing_exception", "reason"=>"[1:558] failed to parse: Limit of total fields [1000] has been exceeded while adding new fields [3]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Limit of total fields [1000] has been exceeded while adding new fields [3]"}}}}

Yeah, but do you have the Logstash logs, not the messages in the DLQ?

From what I was able to see, you are facing mapping explosion issues.

response: {"index"=>{"status"=>400, "error"=>{"type"=>"document_parsing_exception", "reason"=>"[1:558] failed to parse: Limit of total fields [1000] has been exceeded while adding new fields [3]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Limit of total fields [1000] has been exceeded while adding new fields [3]"}}}

That configuration doesn't work for me. It results in two empty events because you prune almost everything away. Also none of the fields that you fingerprint exist, so every event has the same fingerprint.

If I add

    json {
        source => "[parsed_msg][message]"
        target => "parsed_msg"
    }

then the first event is still empty, because there is no [parsed_msg][message] field that contains the payload, but the second looks OK and has a different fingerprint:

{
"@timestamp" => 2025-02-20T15:47:36.403311855Z,
 "@metadata" => {
    "fingerprint" => "cb299e4eddcf9a02adb7995d35c780e8eb20522fea61c421bac2bccb8da5a82e",
           "path" => "....",
           "host" => "...."
}
}
{
          "fr.runAs" => "idm-provisioning",
         "fr.status" => "SUCCESS",
         "fr.before" => {
                    "country" => "United Kingdom",
                       "mail" => "CapTest03022025LI3Gen_15402769@test.com",
         "frUnindexedString1" => nil,
             "memberOfOrgIDs" => [],
             "frIndexedDate5" => nil,
         "frUnindexedString2" => nil,
             "frIndexedDate4" => nil,
                 "custom_IDD" => "undefined",

@leandrojmp - I am only getting the below error in the Logstash logs for this pipeline:

[2025-02-20T12:55:01,260][ERROR][org.logstash.common.io.DeadLetterQueueWriter][aa_cm_app_pipeline][58be2d66786fd816eee5e3ce8ba9b4ea9a96aa285591c1a3f3e024ad1045ae74] Cannot write event to DLQ(path: /data/logstash/dead_letter_queue/aa_cm_app_pipeline): reached maxQueueSize of 1073741824
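
From what I understand, that only means the dead letter queue on disk has hit its 1 GB size cap, so further failed events cannot even be written to it. If needed, the cap can presumably be cleared out or raised in logstash.yml, for example (values shown only for illustration):

# logstash.yml
dead_letter_queue.enable: true
dead_letter_queue.max_bytes: 2gb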

So @Badger - you mean to say that if we add one more json filter, it will give us the 2nd log? The first log is already showing up for me in Kibana.

No, I am not saying that. I have no clue how the filter configuration you showed could produce a useful event from the first message. It does not do so for me. If it does so for you then I wouldn't suggest making the change that made it work for me, since that might well stop it working for you.

@Badger @leandrojmp - Could you please suggest any alternative filter to get both the logs through to Kibana?